Definition - What does Ubiquitous Computing mean?
Ubiquitous computing is a paradigm in which the processing of information is linked with each activity or object as it is encountered. It involves connecting electronic devices, including embedded microprocessors, so they can communicate information. Devices that use ubiquitous computing are constantly available and fully connected.
Ubiquitous computing aims to hide the complexity of computing from the user, making it more efficient to apply computing to everyday activities.
Ubiquitous computing is also known as pervasive computing, everyware and ambient intelligence.
Techopedia explains Ubiquitous Computing
The main focus of ubiquitous computing is the creation of smart products that are connected, making communication and the exchange of data easier and less obtrusive.
Key features of ubiquitous computing include:
Consideration of the human factor and placing of the paradigm in a human, rather than computing, environment
Use of inexpensive processors with reduced memory and storage requirements
Capturing of real-time attributes
Totally connected and constantly available computing devices
Focus on many-to-many relationships in the environment, rather than one-to-one, many-to-one or one-to-many, along with the idea of constantly present technology
Includes local/global, social/personal, public/private and invisible/visible features and considers knowledge creation, as well as information dissemination
Relies on converging Internet, wireless technology and advanced electronics
Increased surveillance and possible restriction of and interference with user privacy, since the digital devices are wearable and constantly connected
Reliability of the equipment used may be affected as technology progresses
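The list above centers on connected devices exchanging data in a many-to-many fashion. As a rough illustration (a minimal sketch, not from the article; the device and topic names are assumptions), an in-memory publish/subscribe bus models this pattern, with a real transport such as MQTT over a wireless network taking its place in practice:

```python
from collections import defaultdict
from typing import Callable

class Bus:
    """In-memory publish/subscribe bus standing in for a real
    wireless/IP transport. Illustrative only."""
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of handlers

    def subscribe(self, topic: str, handler: Callable) -> None:
        self.subscribers[topic].append(handler)

    def publish(self, topic: str, payload) -> None:
        # Every subscriber receives every message on the topic:
        # many-to-many rather than one-to-one communication.
        for handler in self.subscribers[topic]:
            handler(payload)

bus = Bus()
log = []

# Two independent devices react to the same sensor topic
# (hypothetical names for illustration).
bus.subscribe("home/temperature", lambda c: log.append(("thermostat", c)))
bus.subscribe("home/temperature", lambda c: log.append(("display", c)))

# An embedded sensor publishes a real-time reading.
bus.publish("home/temperature", 21.5)
```

After the publish call, both subscribers have received the reading, showing how a single real-time attribute captured by one device can fan out to many others without any device addressing another directly.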