=== Center for Cognitive Ubiquitous Computing (CUbiC) at Arizona State University ===
[[File:Note-Taker device and David Hayden.jpg|thumb|Note-Taker device with initial inventor David Hayden]]
Based on the principles of human-centered computing, the Center for Cognitive Ubiquitous Computing (CUbiC)<ref>{{cite web |url=https://cubic.asu.edu/ |access-date=28 December 2018 |title=Home {{!}} Center for Cognitive Ubiquitous Computing}}</ref> at [[Arizona State University]] develops assistive, rehabilitative, and healthcare applications. Founded by [[Sethuraman Panchanathan]] in 2001, CUbiC conducts research across three main areas of multimedia computing: sensing and processing, recognition and learning, and interaction and delivery. CUbiC emphasizes transdisciplinary research and places individuals at the center of technology design and development. Examples of such technologies include the Note-Taker,<ref>{{cite news |last1=Kullman |first1=Joe |title=Note-Taker device promises to help students overcome visual impairments |url=https://asunow.asu.edu/content/note-taker-device-promises-help-students-overcome-visual-impairments |access-date=28 December 2018 |publisher=ASU Now |date=23 August 2011}}</ref> a device designed to help students with low vision follow classroom instruction and take notes, and VibroGlove,<ref>{{cite web |last1=Panchanathan |first1=Sethuraman |last2=Krishna |first2=Sreekar |last3=Bala |first3=Shantanu |title=VibroGlove |url=https://cubic.asu.edu/content/vibroglove |website=CUbiC.asu.edu |access-date=28 December 2018}}</ref> which conveys facial expressions via haptic feedback to people with visual impairments.

In 2016, researchers at CUbiC introduced "person-centered multimedia computing",<ref>{{cite journal |last1=Panchanathan |first1=S. |last2=Chakraborty |first2=S. |last3=McDaniel |first3=T. |last4=Tadayon |first4=R. |title=Person-Centered Multimedia Computing: A New Paradigm Inspired by Assistive and Rehabilitative Applications |journal=IEEE MultiMedia |date=July–September 2016 |volume=23 |issue=3 |pages=12–19 |doi=10.1109/MMUL.2016.51}}</ref> a paradigm adjacent to HCC that aims to understand a user's needs, preferences, and mannerisms, including cognitive abilities and skills, in order to design ego-centric technologies. Person-centered multimedia computing stresses the multimedia analysis and interaction facets of HCC to create technologies that can adapt to new users despite being designed for an individual.