Definition - What does Collaborative Computing mean?
Collaborative computing describes the use of technology tools to facilitate and enhance group work carried out over distributed systems, allowing individuals to collaborate from remote locations.
Techopedia explains Collaborative Computing
Many different types of modern tools and technologies constitute collaborative computing resources. Some of the earliest systems focused on allowing groups in distributed locations to view files, share information and chat amongst themselves in order to complete projects. As collaborative computing and general technology evolved, videoconferencing and multi-feature conferencing programs upped the ante, providing sophisticated platforms where remote teams could complete tasks like content management, or work on the full “life cycle” of a product or service.
Collaborative computing tools really run the gamut – Google Hangouts could be called “collaborative computing.” Some of the proprietary platforms that remote teams use to deliver graphic design or copy projects could also be called collaborative computing tools. It is important to note that while early collaborative computing technologies focused on bringing together people in different places, many of today's tools focus more on streamlining and organizing the collaborative work of large groups of people who may actually be within the same business campus or other location.
Much of the modern collaborative computing infrastructure offered to companies involves replacing face-to-face meetings and interactions with digital ones. Collaborative computing can serve a business in many different ways, depending on its footprint and operational needs.