“Aging in Place” is a phrase I come across on a regular basis, and it appears to be the ‘in’ phrase right now for the 55+ segment of the population. What does it mean to you? How would you define it? According to Wikipedia, it is: “The ability to live in one’s own home and community safely, independently, and comfortably, regardless of age, income, or ability level.” If you are a younger senior or a baby boomer (born between 1946 and 1964), there is still time to make this decision, to save money, and to plan for the ‘golden years,’ although many may continue to work beyond the typical retirement age of 65-67.
From everything I have heard and read, most adults would like to stay in their own homes until they are no longer able or until they die, but is this realistic? Is your current home modified for future needs, are renovations needed, or are you in the market for an already modified home? An existing home needs a bedroom, bathroom, kitchen, and laundry room on the main floor. If you are shopping for a new or already modified home, look for one built all on one level, ground floor only, with the main features of the house accommodating the needs of an older senior or someone with a physical disability: wider doorways, no entranceway stairs, walk-in showers with a built-in bench, a pull-down or adjustable shower head, non-slip floor surfaces throughout the home, lower cabinets, easy and accessible closet/storage space, etc.
Home maintenance and finances will be huge factors in the decision process as well. Are amenities close by? What about public or alternative transportation, medical care, family support, private support and services, and socialization opportunities? Most importantly, is it affordable? It could cost quite a sum of money to renovate or modify an existing house. Does it make more sense to buy a house that is already modified, whether it is a stand-alone house or one that is part of a community? These are all questions to consider before making a decision. What this signals is a shift in housing from a youth-driven market to one driven by the aging population.
If boomers think about aging in place at all, we usually regard it as something we can put off until much later in life. Or to be more accurate, it’s something we hope we can put off until much later. That’s because we associate these kinds of modifications with growing old, which doesn’t sound like much fun. There’s a better way to think about it, though: Aging in place is about creating a home so beautiful, comfortable and expressive of your personality that you never want to leave. We can’t fight aging, but we can take steps to make our house the place we want it to be.
A reframing of aging itself is needed for the phrase ‘aging in place’ to become acceptable. Aging is inevitable, but health, physical and mental well-being, social opportunities, access to necessary support, services, and healthcare, and finances all play a part in how one ages.
I also came across the Kendall Northern Ohio blog, which has an article, “Aging in Place: Look for Universal Design Features,” listing some very practical advice for those looking to relocate into a home that was not purpose-built in the ‘aging in place’ style but already exists and has universal design features built in that will accommodate someone with a physical disability or a senior with physical or health limitations:
No-Step Entry
French Doors
Wide Interior Doors
Open Space
Downstairs Bathroom/Bedroom
Low Shelving and Sinks
Smooth Shower Entry
Easy Turn Faucets/Fixtures
Easy Twist Doorknobs
Grab Bars
Baby boomers do not want to age in the typical senior residences or nursing homes that today’s ‘older’ seniors have chosen or been placed in due to health conditions, diseases, physical limitations, or cognitive disorders. The larger a cohort is, the more influence it can have on the market, and this is already being seen. Baby boomers are well-educated and demanding, and they research information as needed to expand their knowledge. Their demands will be met, or businesses will lose out financially. It will be interesting to watch this market grow.
Written by Victoria Brewster, MSW, SJS Staff Writer
An easy-to-use EEG cap could expand the number of ways to interact with your mobile devices.
One day, we may be able to check e-mail or call a friend without ever touching a screen or even speaking to a disembodied helper. Samsung is researching how to bring mind control to its mobile devices with the hope of developing ways for people with mobility impairments to connect to the world. The ultimate goal of the project, say researchers in the company’s Emerging Technology Lab, is to broaden the ways in which all people can interact with devices.
In collaboration with Roozbeh Jafari, an assistant professor of electrical engineering at the University of Texas, Dallas, Samsung researchers are testing how people can use their thoughts to launch an application, select a contact, select a song from a playlist, or power up or down a Samsung Galaxy Note 10.1. While Samsung has no immediate plans to offer a brain-controlled phone, the early-stage research, which involves a cap studded with EEG-monitoring electrodes, shows how a brain-computer interface could help people with mobility issues complete tasks that would otherwise be impossible.
Brain-computer interfaces that monitor brainwaves through EEG have already made their way to the market. NeuroSky’s headset uses EEG readings as well as electromyography to pick up signals about a person’s level of concentration to control toys and games (see “Next-Generation Toys Read Brain Waves, May Help Kids Focus”). Emotiv Systems sells a headset that reads EEG and facial expression to enhance the experience of gaming (see “Mind-Reading Game Controller”).
To use EEG-detected brain signals to control a smartphone, the Samsung and UT Dallas researchers monitored well-known brain activity patterns that occur when people are shown repetitive visual patterns. In their demonstration, the researchers found that people could launch an application and make selections within it by concentrating on an icon that was blinking at a distinctive frequency. Robert Jacob, a human-computer interaction researcher at Tufts University, says the project fits into a broader effort by researchers to find more ways for communicating with small devices like smartphones. “This is one of the ways to expand the type of input you can have and still stick the phone in the pocket,” he says.
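The article doesn't spell out the signal processing, but what it describes matches the well-known steady-state visually evoked potential (SSVEP) technique: stare at a target flickering at a known rate, and that rate shows up in the EEG spectrum. Here is a minimal, illustrative sketch of that selection step (this is not Samsung's code; the sampling rate, flicker frequencies, and single-channel setup are all assumptions):

```python
# Illustrative SSVEP-style selection, not Samsung's actual pipeline.
# Assumes one EEG channel sampled at fs Hz and icons flickering at
# known, well-separated frequencies.
import numpy as np

def detect_selected_icon(eeg, fs, icon_freqs, tol=0.25):
    """Return the index of the icon whose flicker frequency shows the
    strongest response in the EEG spectrum."""
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg))))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    powers = [spectrum[(freqs > f - tol) & (freqs < f + tol)].max()
              for f in icon_freqs]
    return int(np.argmax(powers))

# Toy check: three icons at 8, 10, and 12 Hz; synthetic "EEG" with a 10 Hz component.
fs = 256
t = np.arange(0, 4, 1.0 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))
print(detect_selected_icon(eeg, fs, [8, 10, 12]))  # prints 1, the 10 Hz icon
```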
Finding new ways to interact with mobile devices has driven the project, says Insoo Kim, Samsung’s lead researcher. “Several years ago, a small keypad was the only input modality to control the phone, but nowadays the user can use voice, touch, gesture, and eye movement to control and interact with mobile devices,” says Kim. “Adding more input modalities will provide us with more convenient and richer ways of interacting with mobile devices.”
Still, it will take considerable research for a brain-computer interface to become a new way of interacting with smartphones, says Kim. The initial focus for the team was to develop signal processing methods that could extract the right information to control a device from weak and noisy EEG signals, and to get those methods to work on a mobile device.
Jafari’s research is addressing another challenge—developing more convenient EEG sensors. Classic EEG systems have gel or wet contact electrodes, which means a bit of liquid material has to come between a person’s scalp and the sensor. “Depending on how many electrodes you have, this can take up to 45 minutes to set up, and the system is uncomfortable,” says Jafari. His sensors, however, do not require a liquid bridge and take about 10 seconds to set up, he says. But they still require the user to wear a cap covered with wires.
The concept of a dry EEG is not new, and it can carry the drawback of lower signal quality, but Jafari says his group is improving the system’s processing of brain signals. Ultimately, if reliable EEG contacts were convenient to use and slimmed down, a brain-controlled device could look like “a cap that people wear all day long,” says Jafari.
Kim says the speed with which a user of the EEG-control system can control the tablet depends on the user. In the team’s limited experiments, users could, on average, make a selection once every five seconds with an accuracy ranging from 80 to 95 percent.
“It is nearly impossible to accurately predict what the future might bring,” says Kim, “but given the broad support for initiatives such as the U.S. BRAIN initiative, improvements in man-machine interfaces seem inevitable” (see “Interview with BRAIN Project Pioneer: Miyoung Chun”).
If you have never checked out any TED talks before... now is the time to start. I promise you that you will be hooked!
TED's mission statement begins: We believe passionately in the power of ideas to change attitudes, lives and ultimately, the world. So we're building here a clearinghouse that offers free knowledge and inspiration from the world's most inspired thinkers, and also a community of curious souls to engage with ideas and each other...
TED stands for Technology, Entertainment and Design. Their talks are dedicated to disseminating "ideas worth spreading". Take a break from all the reality shows, sitcoms and dramas and watch a TED talk instead with your loved ones. You will be moved, enlightened, informed and inspired! Here are just some of our favorite TED talks dedicated to assistive technology and/or disability-related topics. Have you seen a talk you would like to share? Put it in our comment box - and enjoy!
1. Sue Austin: Deep sea diving … in a wheelchair
When Sue Austin got a power chair 16 years ago, she felt a tremendous sense of freedom -- yet others looked at her as though she had lost something. In her art, she aims to convey the spirit of wonder she feels wheeling through the world. Includes thrilling footage of an underwater wheelchair that lets her explore ocean beds, drifting through schools of fish, floating free in 360 degrees.
2. Todd Kuiken: A prosthetic arm that "feels"
Physiatrist and engineer Todd Kuiken is building a prosthetic arm that connects with the human nervous system -- improving motion, control and even feeling. Onstage, patient Amanda Kitts helps demonstrate this next-gen robotic arm.
3. Aimee Mullins: The opportunity of adversity
The thesaurus might equate "disabled" with synonyms like "useless" and "mutilated," but ground-breaking runner Aimee Mullins is out to redefine the word. Defying these associations, she shows how adversity -- in her case, being born without shinbones -- actually opens the door for human potential.
4. Joshua Walters: On being just crazy enough
At TED's Full Spectrum Auditions, comedian Joshua Walters, who's bipolar, walks the line between mental illness and mental "skillness." In this funny, thought-provoking talk, he asks: What's the right balance between medicating craziness away and riding the manic edge of creativity and drive?
A phone company is exploring ways to bring mind control to its mobile devices in hopes of allowing people with mobility impairments to communicate and function more easily in modern society. But the ultimate goal of the brain-controlled computer project is to broaden the ways in which all people can interact with devices, researchers in Samsung’s Emerging Technology Lab told MIT Technology Review.
The Samsung researchers are testing how people can use their thoughts to open an application, communicate a message, select a song from a playlist, or turn a Samsung Galaxy Note 10.1 on or off. The researchers are working on the new brain-controlled technology in collaboration with Roozbeh Jafari, an assistant professor of electrical engineering at the University of Texas, Dallas. The early-stage research, which utilizes a plastic cap covered with EEG-monitoring electrodes and a tablet device, shows how a brain-computer interface could help someone with mobility issues complete tasks that otherwise could not be done.
In using EEG-detected brain signals to control the interface, the researchers monitored typical brain activity patterns that occur when people are shown repetitive visual patterns.
The Samsung and UT Dallas researchers found that people could launch an application and make selections within it by concentrating on an icon that was blinking at a distinctive frequency. Discovering new ways to interact with mobile devices has been a driving force behind the project, Insoo Kim, Samsung’s lead researcher, told Technology Review.
New technology that lets users control game avatars and music playlists with their brainwaves could give stroke patients and the profoundly disabled new ways to communicate.
A small but growing industry of inventors, neurologists, and investors is betting on consumers controlling smartphones, music players, and even desktop computers with their brains. Innovations in software development kits (SDKs), alongside cheap, ever more sophisticated brainwave readers, mean people with money to spend can play computer games through thought alone. But the products they are using--and the patents behind them--could change the world for the neurologically impaired in a decade or two.
Last month at SXSW, Canadian neuroscientist and artist Ariel Garten showed off her commercial brainchild. The Muse is a $200 sensor-enabled headband that connects with PCs and Macs and allows users to control games with their thoughts or engage in rudimentary neurofeedback. Garten spoke about the Muse and her company, InteraXon, in late 2012 at a TEDx talk in Toronto, which went viral thanks to a discussion of the technology the Muse could lead to. Headbands are expected to ship to customers in late 2013.
Using the Muse was an interesting experience. I had the opportunity to test a prototype out, and the headband slipped on easily--no sterile environment or special electrode setup was required. The headband was accompanied by a number of games and apps, all of which turn brainwaves into data input through embedded electroencephalograph (EEG) sensors. Although the games were dead simple, they were controlled by my thoughts. I was able to manipulate my avatar's motions on screen by thinking happy, sad, or anxious thoughts. Whenever I tried to throw the interface a curveball, it appeared to react reasonably well to whatever line of thought or emotion I was engaged in.
Garten told Fast Company that she first began experimenting with brain-computer interfaces in 2003. Along with InteraXon co-founder Chris Aimone, she created public art installations where people's brainwaves could change the art. “We started by creating concerts where 48 people at a time could control a musician's output, which would then affect people's brain state when they heard it, in a regenerative cycle. We went on to create more musical performances, where musicians could be jamming along to music directly with their brain. It was tons of fun,” Garten said.

EEG-reading headbands aren't only used for consumer games, either. Another product making the rounds at SXSW was the Zen Tunes app from Japanese firm Neurowear. Neurowear, which was featured in Co.Design a few years ago for its cosplay brain-powered cat ears (really), manufactured an integrated prototype headset and iOS app combo that generates playlists tailored to a user's brainwaves. Neurowear customers put on an EEG-enabled headset and load songs from their music library onto a playlist. Once the songs are playing, algorithms within the Zen Tunes app analyze brainwaves for EEG patterns associated with focus and relaxation. These patterns are then used to sort music into playlists that, ideally, will match a user's specific moods.
Brain-computer interfaces (BCIs) have been around since the 1970s, when clunky EEG readers were used in laboratory settings for rudimentary neurofeedback and biofeedback programs. Although the readings and data inputs from EEG readers have not changed significantly over the past forty-odd years, the equipment has. EEG readers no longer require a university laboratory, a quiet room sealed against outside sounds that cause false positives, or nurses and lab technicians to assist with setup: they have become consumer technology. The Muse headband, Neurowear's floppy animal ears, and competing products from firms like Axio are easy-to-use diversions for anyone with a few hundred dollars to burn. Today's brain-reading headbands require no medical training to use, have a tiny learning curve, and frankly are a ton of fun.
As Garten put it, “The main concept in brain-machine interfaces is that changes in your brain are reflected in changes in some signal, in our case EEG, which can then be used as a kind of control action to a machine, without the need of using any physical action or command. Your brainwaves (EEG) are small electrical potentials on your scalp as a result of neurons firing in your brain, and Muse's four electrodes record this fluctuating voltage several hundred times per second. These voltages are converted to digital signals and a stream of numbers is sent to your PC or Mac via Bluetooth.” The SDK lets developers turn these EEG readings into data that can then control program avatars. Alternately, developers could use the SDK to write neurofeedback software that lets users view their brain behavior and trace patterns related to hyperactivity or anxiety. According to Garten, Muse's SDK will also provide some preliminary analysis tools that let you extract more meaningful interpretations of the data, such as the power of "alpha" or "beta" frequencies, and use those as control signals for various devices.
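To make "the power of alpha or beta frequencies" concrete, here is a generic band-power sketch of the kind of analysis Garten describes. It is not the Muse SDK; the band edges are conventional textbook values, and the array-in, dict-out interface is an assumption:

```python
# Generic EEG band-power sketch (not the Muse SDK). Assumes a block of
# samples as a NumPy array recorded at fs Hz.
import numpy as np

BANDS = {"alpha": (8.0, 12.0), "beta": (13.0, 30.0)}  # conventional edges

def band_powers(samples, fs):
    windowed = samples * np.hanning(len(samples))      # reduce spectral leakage
    power = np.abs(np.fft.rfft(windowed)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    return {name: power[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}
```

A relaxation "control signal" could then be something as simple as the ratio of alpha to beta power, smoothed over successive windows.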
However, the real future potential for brain-computer interfaces is in healthcare. When I spoke to Garten, she was forthcoming about everything but the potential repurposing of Muse's technology for clinical applications. There's a reason for that. Brain-computer interfaces, and higher-end versions of the sensors used in consumer headbands like Muse, have world-changing ramifications for traumatic brain injury patients, stroke victims, and individuals with physical disabilities.
Kamran Fallahpour is a psychologist at New York's Brain Resource Center who has used brain-computer interfaces professionally for more than 20 years. The Brain Resource Center uses brain mapping and mind-computer interfaces with patients dealing with everything from mood disorders to traumatic brain injuries. Other patients are professional musicians or actors seeking brain mapping in the course of peak performance training. When these patients visit the office, they essentially use a more complicated version of Neurowear's and Muse's software input kits.

According to Fallahpour, the big innovation in brain-computer interfaces is the ever-increasing capability of computers. Even an iPhone has the sheer processing power to parse data points that a 1990s-vintage 486 could not. The human mind is immensely complicated, and neurologists understand very little of it. Even so, the information that brain-computer interfaces transform into bits and bytes overwhelmed past computers. Advances in technology mean it's now possible to have basic home mind-reading headbands for your smartphone or laptop--something that was science fiction until quite recently.
But while it's fascinating to control avatars in Angry Birds-type games with your thoughts, it's still all fun and games. According to Fallahpour, most consumer EEG headbands and brain-computer interfaces are “toys” that lack the capabilities of research- and clinical-grade systems. The inexpensive sensors create a large number of artifacts, give vague readings, can be affected by physical movement, and only detect basic states like stress and relaxation. The more sophisticated versions of these commercial brain-computer interfaces are now being used in hundreds of laboratories nationwide in neurofeedback projects that treat post-traumatic stress disorder, hyperactivity, and a host of other conditions.
Other brain-computer interfaces, however, are far more sophisticated. Back in 2010, Fast Company reported on an early project to type into computers using brain-computer interfaces. Since then, the systems have grown even more capable, and they could change the world for disabled patients. Researchers at Drexel University College of Medicine in Pennsylvania are currently studying brain-computer interfaces for ALS patients. Using laboratory-quality EEG headsets, scientists hope to see whether individuals with ALS with “extreme loss of neuromuscular control and severe communication impairments” can make selections on a computer screen with their brains. In similar projects, patients were able to type short text messages using only their brain waves.
Drexel is currently recruiting participants with ALS in the Philadelphia metropolitan area for the study.
NEW YORK (AP) — As her mother and father edged toward dementia, Nancy D'Auria kept a piece of paper in her wallet listing their medications.
It had the dosages, the time of day each should be taken and a check mark when her folks, who live 10 miles away, assured her the pills had been swallowed.
"I work full time so it was very challenging," said D'Auria, 63, of West Nyack.
Now she has an app for that. With a tap or two on her iPhone, D'Auria can access a "pillbox" program that keeps it all organized for her and other relatives who share in the caregiving and subscribe to the app.
"I love the feature that others can see this," D'Auria said. "I'm usually the one who takes care of this, but if I get stuck, they're all up to date."
From GPS devices and computer programs that help relatives track a wandering Alzheimer's patient to iPad apps that help an autistic child communicate, a growing number of tools for the smartphone, the tablet and the laptop are catering to beleaguered caregivers. With the baby boom generation getting older, the market for such technology is expected to increase.
The pillbox program is just one feature of a $3.99 app called Balance that was launched last month by the National Alzheimer Center, a division of the Hebrew Home at Riverdale in the Bronx.

"We thought there would be an opportunity here to reach caregivers in a different way," said David Pomerantz, executive vice president of the Hebrew Home. "It would be a way to reach people the way people like to be reached now, on their phone."
The app also includes sections for caregiving tips, notes for the doctor and the patient's appointments, plus a "learning section" with articles on aspects of Alzheimer's and an RSS feed for news about the disease.
Trackers are also important tools for Alzheimer's caregivers.
Laura Jones of Lighthouse Point, Fla., says she was able to extend her husband's independence for a year and a half by using a program called Comfort Zone.
"He was just 50 when he was diagnosed," she said.
Jones said she went to work so he would continue to get insurance coverage.
"Day care was not appropriate, home care was not affordable," she said. "Even when he stopped driving, he would ride his bike all over town, to the gym, for coffee, errands. He would take the dog for a walk and be out and about when he was alone and I was working."
Using Comfort Zone, which is offered by the Alzheimer's Association starting at $43 a month, she was able to go online and track exactly where he was and where he had been.
Her husband carried a GPS device, which sent a signal every five minutes. If Jones checked online every hour, she would see 12 points on a map revealing her husband's travels. She would also get an alert if he left a designated area.
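The alert Jones describes is a textbook geofence check: compare each GPS fix against a designated safe zone. Comfort Zone's actual rules aren't public, so the circular zone and radius below are assumptions, but the logic reduces to a distance test:

```python
# Minimal geofence sketch; Comfort Zone's real implementation is not public.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def has_left_safe_zone(fix_lat, fix_lon, home_lat, home_lon, radius_m=1500):
    """True if the latest GPS fix is outside the designated area."""
    return haversine_m(fix_lat, fix_lon, home_lat, home_lon) > radius_m
```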
Eventually, the tracking revealed that Jones' husband was getting lost.
"He would make a big funny loop off the usual route and we knew it was time to start locking down on him," she said.
Mended Hearts, an organization of heart patients and their caregivers, is about to start a program to reach caregivers by texting tips to their phones.
"We hope this will be the beginning of several patient- and caregiver-based texting programs that reach people where they are," said executive director Karen Caruth.
Lisa Goring, vice president of Autism Speaks, said tablets have been a boon to families with autistic children. The organization has given iPads to 850 low-income families. And the Autism Speaks website lists hundreds of programs — from Angry Birds to Autism Language Learning — that families have found useful.
Samantha Boyd of McConnellstown, Pa., said her 8-year-old autistic son gets very excited when the iPad is brought out.
"There's no way he'd be able to use a keyboard and mouse," she said. "But with the iPad, we use the read-aloud books, the songs, the flash card apps."
She said the repetitiveness and visuals help. "He catches a word and repeats it back. He says the name of a picture, and the iPad says it back."
Boyd said the iPad also works as a reward: "He likes to watch Netflix on it."
One of the most popular online tools for caregivers is one of the oldest: the message board, available all over the Internet and heavily used by caregivers of dementia and autism patients, who perhaps can't find the time for conventional support groups.
"It's a place for families to talk about the strengths and the accomplishments of their child with autism but also talk about some of the challenges and be able to find the support of other families," Goring said.
Some tools are not specific to a particular disease or condition.
CareFamily, which prescreens in-home caregivers and matches them to customers over the Internet, has online tools that let a family remotely monitor a caregiver's attendance, provide reminders about medications and appointments, and exchange care plans and notes via email, texting or phone.

"We're in the infancy of what technology can do for caregiving and it's only going to grow," said Beth Kallmyer, a vice president at the Alzheimer's Association.
But she cautioned that it's too soon to depend entirely on online tools.
"It's not a good fit for everybody," she said. "When you're looking at people impacted by Alzheimer's disease, including some caregivers, you're looking at an older population that might not be comfortable. We always have to remember technology is great — when it works."
Spinal cord injury, stroke, and hundreds of congenital and acquired disorders impair the use of hands--an essential body part for using touch-screen technology. A handful of apps are switch-accessible, but these consist mainly of AAC apps and some early childhood books and games (Jane Farrell keeps a list here). For all other apps, these users are out of luck for now. However, there is at least one app that shows potential for readers.
MagicReader is a free, ad-supported app for iPad released by the Japanese developer GimmiQ about a year ago. The app uses the iPad's camera to recognize a face, and then track head movement, allowing users to turn the pages of books. The app currently only supports PDF files and compressed comic book files (there are several comics available free in-app), but the developer promises to support more formats soon. After importing a PDF through iTunes or email, you need to find the right distance and lighting to optimize the facial recognition. Once the app reliably finds your face, it is fairly simple to turn the pages forward and back with a turn of the head, even while wearing clear glasses. Two blue stars at the top light up when the app has found your face, letting you know you can turn your head to turn the page. Looking upwards navigates in and out of the library.
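GimmiQ hasn't published how MagicReader's tracking works, but a crude stand-in for camera-based page turning can be sketched with OpenCV's stock face detector: watch where the detected face sits in the frame, and fire a page turn when it drifts far enough off center. Real head-pose estimation is more involved; this only illustrates the idea, and the thresholds are arbitrary:

```python
# Crude head-turn paging sketch using OpenCV's bundled Haar face detector.
# Not MagicReader's algorithm.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        offset = (x + w / 2) - frame.shape[1] / 2  # face center vs. frame center, px
        if offset > frame.shape[1] * 0.2:
            print("turn page forward")
        elif offset < -frame.shape[1] * 0.2:
            print("turn page back")
    if cv2.waitKey(30) == 27:  # Esc to quit
        break
cap.release()
```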
The app's usefulness for disability populations is currently limited in that it requires a 45-degree turn of the head rather than tracking only eye movements. Of course, users will still need assistance opening the app unless they have a more sophisticated setup. When I first used the app, it took some time to find just the right distance, head-turn speed, and lighting conditions for reliable turning, and sometimes the pages flipped when I wasn't ready or had just looked up from the tablet. I tend to read a lot of PDF files, but most people read e-books, which are not yet supported.
When the iPad is mounted on a wheelchair or supported on a stand, this app could be of great use to many people. Stroke survivors who can hold the device in one hand can now use their heads to turn the page instead of setting the device down to touch the screen. The description recommends the app for people reading recipes while cooking, musicians turning sheet music, parents reading while holding babies, and even people reading while eating. This app may be useful in your practice now, but more than that, I think it shows the potential for alternative means of accessing tablet technology. Given that the app's FAQ states a paid version is coming, it's probably worthwhile to download MagicReader now while it's free.
BodyWave is a brain wave monitor that attaches to the body much like an MP3 player. There’s no need to wear a silly looking headset. Wearing BodyWave, you can discreetly increase your mental capacity, improve your physical performance, or control objects in your surroundings without anyone ever knowing. And better yet, it’s all by mind alone! In the near future, BodyWave will speak directly to you without the need for a cell phone or computer. Standing over your golf ball ready to putt? BodyWave will tell you when you’ve reached a peak performance state and are ready. Want to make that online trade? BodyWave will say, “It’s now time.”
How it works
Three dry sensors on the back of the unit contact the skin and begin searching for brainwave activity. The brain wave patterns are transmitted via Bluetooth or WiFi to a computer or handheld device. Brainwave activity can control computer activities, 3D training scenarios, or live field equipment. Currently we have algorithms that monitor drowsiness, attention, meditation, stress/anxiety, and peak performance. This is just the beginning! We’re developing apps to turn your cell phone on or off by mind alone. External control of home electronics is just around the corner as well — all from your brain and off the wrist! We also think that 3D control on the X, Y, and Z axes is just around the corner too. Think: forward, back, left, right, up, and down. Think it and it’s done.
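Stripped of the marketing, the pipeline described is: dry sensors read scalp voltages, a radio link streams them to a nearby device, and an algorithm maps them to a state such as "peak performance." Below is a heavily hedged sketch of that last step; `read_attention` is a hypothetical function standing in for the vendor's proprietary scoring, and the threshold values are assumptions:

```python
# Illustrative only: turning a streamed attention score into a "ready" cue,
# as the BodyWave copy describes. The scoring itself is proprietary.
import time

THRESHOLD = 0.8    # assumed attention score in [0, 1]
HOLD_SECONDS = 3   # score must stay above threshold this long

def wait_for_peak_state(read_attention):
    """Block until the attention score holds above THRESHOLD."""
    above_since = None
    while True:
        now = time.time()
        if read_attention() >= THRESHOLD:
            above_since = above_since or now
            if now - above_since >= HOLD_SECONDS:
                return  # e.g., cue the golfer that it's time to putt
        else:
            above_since = None
        time.sleep(0.1)
```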
Training environments
Virtually any 3D training scenario can be governed by brain wave activity. BodyWave makes learning to relax, pay attention, and reduce stress and anxiety simple.
Peak performance state for athletes, done.
Reduce stress for flight controllers, done.
Optimize attention in students, done.
Optimize performance for law enforcement, medical, or other high stress jobs, done.
Social media is now a very important part of today’s society. Beyond transforming the business sector, it has changed the landscape of health care as well. In fact, many therapists agree that Facebook, Twitter, YouTube, LinkedIn, Pinterest, Flickr and many other social media sites have done a great deal to help them grow professionally. So how exactly does social media empower therapists?
It strengthens the therapist-patient relationship.
Communication between patients and the health care team has evolved for the better, thanks to social media sites. At sites such as Patients Like Me, patients are able to share their medical conditions with other patients and health care professionals around the world, allowing them to compare treatment procedures, discuss test results and learn from each other. Instead of reading medical articles and reference blogs, patients can directly ask the assigned therapy professional and get answers to their queries in real time. This keeps medical information transparent, accurate and fast.
It shares industry breakthroughs.
Nowadays, therapists no longer rely solely on magazines and journals for learning. Instead, they turn to social media sites to gain insight into the health care industry’s new and upcoming trends. Following professional sites tells therapists what’s new in the medical sector and offers them reliable information. Following their favorite bloggers can also give them insight into what’s happening in the health care profession.
It spreads awareness about public health issues.
Social media becomes an especially important tool during emergencies such as Hurricane Sandy. By retweeting posts on Twitter or posting links on Facebook, therapists are able to inform and promote the safety and well-being of individuals in the community. Therapists can also raise awareness for a certain cause through tweets, blogs and posts.
It makes communication easier, especially for travel therapists.
Before social media, travel therapists would have to send out or receive tons of letters for their assignments. Some instructions could not reach them while they were on the road. Now, travel therapists only need to follow a Twitter feed, read a blog post, or check Flickr photos for instructions. Moreover, communicating with friends and family back home has become easier, thanks to social networking sites. Sharing photos and videos with loved ones is also possible through Facebook and Flickr.
It shares knowledge, expertise, and support with fellow healthcare professionals.
Sites such as PutMeBackTogether are avenues for sharing knowledge and expertise with other therapists. All one needs to do is register an account, and he or she is free to participate in forums and discussion boards, ask and answer questions, read related articles, and even find therapy job opportunities.
It helps therapists de-stress.
YouTube, Metacafe and other video sites can help therapists de-stress. Numerous videos are light and funny -- the perfect medicine after a tiring 12-hour shift. Social networking sites can also serve as the temporary escape from the daily grind.
Do you know of other ways social media can empower therapists and other health care professionals? Share them here!
About the author: Based in San Diego, California, Melissa Page is a social media contributor who has been writing about the importance of social media on various blogs. She currently works with My Life, a revolutionary platform that allows you to manage your social network updates and email messages securely and conveniently on one dashboard. When she’s not busy writing, she’s out with her friends.
Scott Mackler attends a New York Mets game with his son, Noah, and friend, Jen.
Scott Mackler is trapped.
It’s a Wednesday afternoon, and Scott is putting the final touches on a grant proposal — the periodic, cymbal-like sputtering of his ventilator cutting through the silence of an otherwise empty room. Sitting back about 45 degrees in a wheelchair with his hands folded neatly across his lap, Scott is where he has worked for the past decade: in his ground floor lab in the Perelman School of Medicine. I’m joking around with Scott’s nurse for the day, Adam Czerwinski, when we realize that Scott is trying to tell us something.
Adam leans up against Scott’s electric chair, his elbows pressing into the machine’s soft padding. His eyes no more than a foot away from Scott’s, he begins to rattle off a series of questions. “First, second or third? Is it first?”
Scott’s eyes shift slowly toward us — a sign that we have the right answer. “It is first.”
Then, we begin to go through the first third of the alphabet.
“A, B, C, D, E, F,” Adam says in a low, methodical voice.
Once again, Scott’s eyes move in our direction. We’ve arrived at the correct letter: “F.” Based on the fact that it’s noon — right around lunchtime for Scott — we’re able to deduce fairly quickly that he’s trying to spell out “food.”
Coming up with the word takes us about 13 seconds.
Scott hasn’t been able to speak for 13 years.
***
Scott, a 54-year-old professor in the Medical School, was diagnosed with Amyotrophic Lateral Sclerosis — more commonly known as Lou Gehrig’s disease — in 1999.
While some ALS patients retain basic movement or speech functions as the disease progresses, Scott’s wife, Lynn Snyder-Mackler, describes her husband’s state as a “worst-case scenario.” Scott is essentially “locked in” — his mind is as brilliant as ever, but he’s unable to convey any of that brilliance to his friends or family by himself.
Other than his slight eye movements and an occasional ability to crack a smile, Scott can’t move a single part of his body, forever confined to his chair unless helped by somebody else. He can’t breathe without a ventilator, he can’t eat without a feeding tube and he can’t speak without one of his nurses running through the alphabet with him — a tiring, painstaking task to complete dozens of times every day.
Scott is among 30,000 people nationwide who have ALS, according to the ALS Association. There is no known cure for the disease.
“His mind is so superior to all of ours,” says his older brother, 1975 Wharton graduate Harvey Mackler. “His mind knows exactly what he wants to say, but he’s trapped — a prisoner inside his own body. If that was me, I would’ve checked out a long time ago.”
While the vast majority of ALS patients don’t live for more than five years after their diagnosis, giving up on life is the furthest thing from Scott’s mind.
From Monday to Thursday every week, Scott still makes the hour-long trip from his Newark, Del., home to Penn’s campus, continuing to direct his lab on the ground floor of the John Morgan Building. Although he’s no longer able to lecture or work with patients, he’s still on the cutting edge of drug addiction research, one eye movement at a time.
Recently, his team found that NAC-1 — a protein in the brain that Scott discovered in 1996 as part of his addiction studies — may play a role in the cause of ALS.
He continues to research NAC-1 today. I ask him over email why, despite all of the challenges, he keeps on going, day after day.
“I have no desire to slow down,” he replies, dictating his responses to Lynn through his usual eye-shift method. “I am a scientist. I have so many ideas and want to see this work continue and even explore new areas. I have fulfilling work.”
***
We’re sitting in Scott’s office again, just finishing up lunch.
Scott is wearing a blue-striped shirt that is neatly tucked into a pair of black pants. His hair has all but finished the graying process, a sign that he’s slowly approaching 60.
A dense white cloth supports the back of his head on the chair, while a long blue tube is protruding from his neck, expanding in and out with every breath he takes.
Adam and I are playing around with Scott, poking fun at his favorite baseball team: the New York Mets.
Scott smiles.
It’s little more than a faint curl of the end of his lips, but it’s a powerful moment — a true register of emotion without Scott having to worry about moving his eyes.
We’re getting ready to take a walk outside, but Scott wants to send one last email before we go. Like he’s done thousands of times before, Adam begins his routine to determine what Scott is trying to say.
He starts by asking who the email should be sent to.
“First, second or third? Is it third?”
To communicate, Scott divides the alphabet into thirds: A-H, I-Q and R-Z. Once you’ve landed on the correct third — which you’re able to determine once he looks in your direction — you begin to go through each individual letter one by one, until he looks at you again.
Guessing a word based on context can sometimes speed up the process, but Scott has made it clear over the years that too much guessing bothers him.
The process is far from foolproof. Sometimes, Scott’s eyes — barely even slits — will “freeze up,” making it exceedingly difficult for him to look in one direction or the other. Other times, you’ll come up with an entire string of words, only to discover that you have to start over after Scott spells out “wrong.”
But it’s the best system Scott has.
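Viewed as an algorithm, the system is a two-level scan: one round of questions narrows the alphabet to a third, and a letter-by-letter scan narrows the third to a single character. Here is a sketch of the protocol, with the `got_yes` function standing in for watching Scott's eyes shift:

```python
# Sketch of the partition-and-scan spelling protocol described above.
import string

THIRDS = [string.ascii_uppercase[:8],    # A-H
          string.ascii_uppercase[8:17],  # I-Q
          string.ascii_uppercase[17:]]   # R-Z

def spell_one_letter(got_yes):
    """got_yes(prompt) -> bool; True when Scott's eyes shift toward you."""
    for third in THIRDS:                 # "First, second or third?"
        if got_yes(f"{third[0]} through {third[-1]}?"):
            for letter in third:         # "A, B, C, D, E, F..."
                if got_yes(letter):
                    return letter
    return None                          # missed it; start over
```

At worst, a single letter costs three partition questions plus nine letter prompts, which is why context guesses like "food" or "Seth" save so much time, and why a wrong guess that forces a restart is so costly.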
It takes Adam about 10 minutes to get three letters from Scott — S, E and T — when I suddenly realize whose name he might be trying to spell.
“Is it Seth?” I ask, somewhat amazed and embarrassed by the prospect that it could have taken me this long to recognize my own name.
Scott shifts his eyes to look straight at me — a clear yes.
It turns out that Scott had wanted to send me a copy of some PowerPoint slides I’d requested from him a day earlier.
If it wasn’t for Scott, I probably would’ve forgotten about the slides.
“I forget things all the time, but Scott, what an incredible memory,” Adam says. “To do what he still does in his condition has given me a whole new outlook on life. I think it gives us all a new outlook.”
***
Scott first began to suspect that he had ALS near the end of 1998, when one day he was playing tennis and had trouble maintaining a firm grip on his racket.
Soon after, he began slurring his speech, finding it increasingly difficult to form sentences in the way he wanted.
His inability to control bouts of laughter, his involuntary muscle contractions — they were all signs of the onset of ALS.
In his younger days, Scott was the definition of active.
A man who could once run a mile at a 4:47 clip, Scott played on Penn’s soccer team during his first few years as an undergraduate at the University in the 1970s. When his two sons, Alexander and Noah — both Penn graduates as well — were growing up, he coached their soccer teams, never missing a weekend game.
“He was their super dad,” Lynn says.

Although Scott was formally diagnosed midway through 1999, he and Lynn had been almost certain for some time that he had ALS.
“ALS is a bad diagnosis, so doctors avoid it,” says Lynn, who is a professor in the University of Delaware’s Department of Physical Therapy. “It’s a diagnosis of exclusion.”
After the diagnosis became real, Lynn and Scott faced one of their toughest challenges yet: telling their friends and family.
“I don’t think any of us will ever forget the email that broke the news,” says Julie Blendy, a professor of pharmacology at Penn and a friend of Scott’s. “I remember it was long, and I remember having to read it at least three or four times for it to really sink in. It was heartbreaking.”
A few days earlier, Julie remembers, Scott had missed the cocktail hour of a reception at which he had been scheduled to give a speech. When he arrived an hour later, his words were a bit slurred — an odd speech pattern for somebody Julie describes as “extremely articulate.”
Nearly everybody in Scott’s life says they remember exactly how they reacted when they first learned of Scott’s diagnosis.
He told his two sons on a family vacation to Chile in May 1999.
The family had just finished hiking up a mountain outside the city of Pucon — “Dad kicked our asses as usual,” says Alexander, a 2005 College graduate — when Scott sat down to tell them.
The next day, while walking down from a volcano, Scott slipped and fell, tearing his meniscus. He hobbled down the rest of the volcano on his own, limping his way through the remainder of the week in Chile.
After that, Scott’s condition deteriorated rapidly. Within a year, he was already confined to his chair, unable to move most of his body.
“He was able to make it through that first year, and after that I knew it was a sign of how tough he was,” Alexander says. “We knew then that he’d fight through this.”
***
“Please don’t think I’m an inspiration, because anyone could have done what I’ve done.”
Scott is “speaking” in an automated voice — which he recorded through a special dictation program he uses — to a group of first-year medical students at Penn.
It’s early April, and Scott is giving his annual talk to the students.
As Lynn stands beside her husband — like she’s done resolutely since the two met at Penn in 1979, a year before Scott earned his undergraduate degree — about 40 students look on in awe as Scott begins to tell his story.
Scott uses the talk not just as an opportunity to discuss ALS, but also to impart some practical advice to the students.
“I just enjoy the experience,” Scott says of why he began giving the lecture years ago. “In the first year, med students learn a lot about science and little about actually working with patients. I am their worst nightmare in some respects. This lecture makes a terrible situation — facing terminal illness and chronic disability — approachable and not so scary. Humor helps.”
Humor has been a powerful tool for Scott, both in his lecture and in everyday life.
During one part of the lecture, Scott speaks of how nothing bothers him more than when he meets somebody who equates his physical condition with his intellect.
On the screen above him, he flashes a friendly message addressing those people: “I have a note from my mother allowing me to tell you to FUCKOFF. If I need help, I will ask.”
Over the years, Scott has traveled to several other schools to give his lecture.
Scott maintains that he’s not an inspiration, but few who have met him would agree.
“It’s tough to put into words what he’s done,” says College freshman Allison Jegla, who is from Michigan. “He’s a hero to me.”
In November 2008, Allison was watching “60 Minutes” on CBS when she saw a segment on Scott. She was awestruck by Scott’s outlook on life and decided to write him an email.
Soon after, Scott wrote her back, inviting her to an upcoming lecture at the University of Michigan. At the time, Allison wanted to be an engineer.
Today, she wants to be a doctor.
She says she found her way to Penn because of Scott.
“He’s essentially a complete stranger to me, and yet his story led me to this place and career,” she says. “It’s 100 percent because of Scott Mackler that I’m here.”
***
You could say that Scott’s story is one that’s made up of a collection of “what ifs.” What if he could somehow speak on his own again? What if he could move just one arm, or a hand or a finger?
Scott’s family, though, would take issue with those “what ifs.”
“You deal with reality when these are the cards you’ve been dealt,” Harvey says. “To us, this is the norm.”
In their eyes, Scott’s story is more like a tale of “wows.”
“His will to live, to continue to be productive in his work and make people understand the nature of this disease — this man is incredible,” says 1980 Wharton graduate Ron Perilstein, one of Scott’s longtime friends.
While Perilstein acknowledges that his relationship with Scott has changed since the two played soccer and were brothers at Pi Kappa Alpha together as undergraduates, much between them remains the same.
“Hey, asshole, how are you?” Ron will say to his old friend whenever the two get together.
While Scott is no longer able to go anywhere on his own, there’s rarely a dull moment in his life today.
During the day, he’s almost always with either Adam or his other nursing assistant, Dana Williams. At his Newark home, he has an “army” of physical therapy students — all of whom are in Lynn’s program at the University of Delaware — at his disposal.
The students help to feed Scott, bathe him, get him ready for work — all basic functions that he’s no longer able to do on his own.
“We have insurance, good friends, good employers and a great network of folks,” Lynn says. “We are really lucky.”
***
It’s a warm spring afternoon as Scott, Adam and I are returning to the lab after a stroll on Locust Walk.
We arrive back at his office, where the dozens of photographs lining the walls serve as a constant reminder of one thing: Scott’s family.

“Our parents instilled in us that family is number one,” says 1977 College graduate Randi Mackler Windheim, Scott’s older sister. “He’s not fighting this alone.”
Most of Scott’s fondest memories from the past decade came when he was with family.
When Alexander got married last year, Scott had a front row seat.
When Alexander and Noah earned their undergraduate degrees in 2005 and 2007, Scott was lined up on Locust Walk like any other professor, dressed in his full commencement robes.
It was one of his proudest moments as a father.
Every year, many of Scott’s family members come out to Newark to participate in the Scott Mackler 5K Run/Walk, which has been held annually since 2000. Through the race — as well as through the Scott A. Mackler, M.D., Ph.D. Assistive Technology Program — the family has raised more than $1 million to assure that no ALS patient in the Philadelphia area goes without some form of communicative technology.
For his part, in addition to communicating through his traditional eyeball-shifting mechanism, Scott has the option of using a device called a Brain Computer Interface, or BCI.
Although Scott rarely uses the BCI today because it is slow and cumbersome to put on and take off, the device essentially operates as a mind reader.
To begin using the BCI, Scott puts on a rubber blue cap with rainbow-colored wires coming out of it. The cap is fitted with electrodes that pick up Scott’s brainwaves.
Once the BCI is turned on, a monitor in front of Scott begins to flash, with different letters of the alphabet becoming illuminated for split seconds apiece. When Scott sees the letter he wants to select, he concentrates on it in his mind. The BCI ultimately registers what letter Scott is thinking about and displays it on the screen.
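The article doesn't name the underlying method, but the flashing-letter design matches the classic "P300 speller" paradigm: each letter flashes many times, and the letter whose flashes evoke the largest average brain response is taken as the intended one. A toy sketch of that final selection step, with all data shapes assumed:

```python
# Toy P300-style selection sketch; not the actual software Scott uses.
import numpy as np

def pick_letter(epochs_by_letter):
    """epochs_by_letter maps each letter to an array of shape
    (n_flashes, n_samples): the EEG recorded after each of its flashes.
    Averaging across flashes suppresses noise; the attended letter's
    averaged response should show the largest peak."""
    scores = {letter: epochs.mean(axis=0).max()
              for letter, epochs in epochs_by_letter.items()}
    return max(scores, key=scores.get)
```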
Communicating with the BCI can be frustrating — it may take Scott six months to write a grant proposal that, years ago, he would have finished in a few weeks — but he says the technology is remarkable nonetheless.
One day, as Adam, Dana and I are trying — and failing — to get the BCI running so that I can see how it works, we notice that Scott is trying to tell us something.
“E, T, H,” he begins to spell out.
He’s trying to say “Ethernet.”
We’d forgotten to plug in the internet.
In a room with three able-bodied people, it hits me at that moment: the smartest person in the room is the one sitting in the chair, unable to utter a word.
***

Scott Mackler is free.
We’ve just finished typing an email to his son, Noah, when Scott indicates that he’s ready to go home for the day.
Tomorrow — and for days after that — Scott will get in a van at around 8:30 a.m., which will take him to the place he’s long considered his second home: Penn.
At one point in time, though, even Scott didn’t think he’d be where he is today.
Soon after his diagnosis, Scott recorded a video to his sons, assuming he would never have the chance to see them grow into adults. “I know the future holds lots of love and joy and pride and that life goes on,” he says to the camera as he begins to run off into the distance. “I’ll be watching you along the way, and I love you very much, and I’ll see you.”
I ask Scott one final question: What do you miss most about your life before ALS?
His answer is simple and not unexpected. “Speaking and kissing my family.”
And with that, we turn our attention back to another run through the alphabet, putting Scott’s thoughts into words — one letter at a time.
Today’s smartphones and computers offer gestural interfaces where information arrives at users’ fingertips with a swipe of a hand. Still, researchers have found that most technology falls short in making people feel as if they’re interacting with virtual objects the same way they would with real objects.
But a team at UW-Madison says it has developed, for the first time, a way to move virtual objects in an immersive virtual reality environment through the use of muscle activity. In addition to making virtual reality more interactive and realistic, the research could have rehabilitation applications for people recovering from injuries or people living with specific disabilities.
“We’re trying to add the dimension of movement and touch to allow people to exert forces against things that are created in front of them with a projector and virtual reality goggles,” says Robert Radwin, a UW–Madison professor of industrial and systems engineering and biomedical engineering and a discovery fellow at the Wisconsin Institute for Discovery (WID)’s Living Environments Laboratory (LEL). “What if we could use these virtual exertions as a way of rehabilitating people from an illness or an injury such as a stroke? What if we could alter people’s abilities for different tasks, making them weaker or stronger during certain exercises?”
In previous research, manipulating virtual objects has relied on wands, controllers and other devices external to the body. Though valuable, these devices are not driven by a person’s muscles the same way as when real objects are picked up or moved.
Radwin worked closely with the LEL to develop the software and process for a pilot study in which participants move virtual objects in the CAVE, a fully immersive six-sided room that projects 3-D environments on its walls. Kevin Ponto, an assistant professor of design studies in the School of Human Ecology, Karen Chen, a graduate student in the LEL, and Ross Tredinnick, systems programmer in the LEL, collaborated in the design of the project.
In the study, participants’ arms were hooked up to an electromyography (EMG) device that collects the electrical signals produced by muscles during physical activity. Situated outside of the CAVE, people lifted dumbbells of different weights while the EMG device recorded muscle activity to a nearby computer.
Then, participants did the same exercise inside the CAVE, wearing 3-D goggles, head and hand sensors, and the EMG device. Instead of lifting a real dumbbell, people “grabbed” and lifted a virtual dumbbell instead, stiffening their arms to lift the object.
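The team's software isn't public, but the standard first step in turning raw EMG into a usable activation signal is to rectify it and smooth it into an envelope, then compare the envelope against a threshold calibrated on real lifts. A sketch under those assumptions:

```python
# Rectify-smooth-threshold EMG sketch; not the UW-Madison team's code.
import numpy as np

def emg_envelope(raw, fs, window_s=0.1):
    """Rectify the signal and smooth it with a moving average."""
    rectified = np.abs(raw - raw.mean())
    n = max(1, int(window_s * fs))
    return np.convolve(rectified, np.ones(n) / n, mode="same")

def is_grabbing(raw, fs, threshold):
    """True if current muscle activation exceeds a calibrated threshold,
    which could gate picking up the virtual dumbbell."""
    return emg_envelope(raw, fs)[-1] > threshold
```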
Karen Chen says the preliminary results show that people can adapt their lifting behavior to a virtual reality environment using the same muscle groups used to lift real objects. The benefit, she adds, is being able to perform lifting in a controlled environment with a reduced risk of strain or dropping real objects.
With more experiments and refinement of the technology, the group may examine downsizing the technology to include a headset rather than an entire virtual reality room.
Ponto says that in addition to rehabilitation applications, the project can open doors to make virtual reality more intuitive, including creating the illusion of lifting objects that don’t actually exist.
Future research could examine how to better tie virtual reality and human muscle groups together so simulations are more personally tailored to a person’s normal activity while sitting or standing.

—Marianne English
"A Lady in Red Brings Tender Care" The following article on Kathryn Voit is from Bayada’s Website:CARE Connection Vol1 Number1.Jan 2012; by Caroline Graham. Sadly Kathryn lost her courageous battle with ALS on March 8th, 2012. She was loved and admired by all of us at The ALS Association Greater Philadelphia Chapter, and she will be deeply missed. How care at home is helping Kathryn live her life to the fullest, despite ALS
Kathryn V. celebrated her 75th birthday in a house filled with laughter and love, courtesy of her husband, Gerry, 80, the couple’s four children, and 10 grandchildren. “We thought about going out to a restaurant, but it’s getting harder for Kathryn to chew and swallow, so we specially prepare her food for her,” says Gerry. “It’s much easier to stay home.”
Diagnosed with amyotrophic lateral sclerosis (ALS) in 1998, Kathryn has defied the odds, as her disease has progressed much more slowly than anticipated. ALS causes nerve cells to waste away or die, preventing them from sending messages to the muscles.
This eventually leads to muscle weakening, twitching, and an inability to move the arms, legs, and body. When the muscles in the chest area stop working, it becomes difficult or impossible to breathe on one’s own.
“The symptoms came on gradually,” says Gerry. “At first she simply couldn’t walk as fast, then she started to trip while walking. We were in the airport in San Francisco on our way home from a vacation when she realized she could barely walk at all.”
Kathryn, who worked for years as a math professor at numerous colleges and universities, did not let her diagnosis stop her from teaching. At first she used a cane, then a wheelchair. But after three or four years it got to be too much, and she had to stop working.
The disease had progressed to the point where Gerry knew he couldn‘t handle her care on his own.
Gerry contacted BAYADA Home Health Care and found out that a home health aide (HHA) would be able to meet Kathryn‘s needs and improve her quality of life.
HHA Connie Smith was carefully matched with Kathryn and Gerry. Kathryn shares that she had always been concerned that having a home health aide would ruin her privacy, but with Connie‘s discreet presence, she didn‘t feel that way. ―We can be silent together or we can talk and laugh together. It‘s fun to have this sisterly contact that we can enjoy, from politics to clothes, to what‘s on the menu today."
“Connie and Kathryn are like peas in a pod,” says Gerry, describing the friendship that has developed in the eight years since Connie began caring for his wife.
ALS has robbed Kathryn of the use of her legs and arms. What’s more, her trunk muscles are too weak for her to sit up on her own, so her wheelchair needs to be in a tilted-back position.
Connie has adapted to the differences in the level of care that she provides to Kathryn with compassion, patience, and skill. “As I become more and more disabled with ALS, Connie has become my hands,” says Kathryn. During a typical day, Connie will bathe and dress Kathryn, wash and style her hair, feed her, and help with toileting. Connie also takes her clothes shopping or to ceramics class. And thanks to a special stand used to prop up books, Kathryn is able to continue her love of reading, while Connie helps by turning the pages.
Modern technology has also opened a world of possibilities for Kathryn. At first, voice-activated software helped Kathryn stay connected. Then, when her voice became unrecognizable because of the disease, Gerry connected their computer to the flat-screen TV. A reflective dot placed on Kathryn’s nose interacts with a sensor, which, in turn, interacts with the computer.
Connie learned how to set up the equipment for Kathryn, who can then control the mouse with the movement of her head.
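The article doesn’t describe the internals of Kathryn’s setup, but the principle behind such head-pointer systems is straightforward to sketch: the sensor reports where the reflective dot sits in its field of view, and that position is mapped, with smoothing, to on-screen cursor coordinates. In the hypothetical Python sketch below, get_dot_position stands in for the real sensor driver.

    SCREEN_W, SCREEN_H = 1920, 1080
    SMOOTHING = 0.8  # exponential smoothing steadies small head tremors

    cursor_x, cursor_y = SCREEN_W / 2, SCREEN_H / 2

    def get_dot_position():
        """Hypothetical sensor driver: the dot's normalized (x, y) in [0, 1]."""
        return 0.5, 0.5

    def update_cursor():
        global cursor_x, cursor_y
        nx, ny = get_dot_position()
        # Blend toward the new target so the cursor glides rather than jumps.
        cursor_x = SMOOTHING * cursor_x + (1 - SMOOTHING) * (nx * SCREEN_W)
        cursor_y = SMOOTHING * cursor_y + (1 - SMOOTHING) * (ny * SCREEN_H)
        return int(cursor_x), int(cursor_y)

    print(update_cursor())  # -> (960, 540) while the dot is centred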
“We have always been able to adapt to each new challenge as her disease progresses,” says Gerry.
Last year, as ALS took away her voice, Kathryn found that writing poetry became a way to communicate with the world. During her birthday weekend, Kathryn participated in a special poetry reading at her church. Kathryn wrote this poem about Connie and her care:
A lady in red
Brings tender care,
Fills my needs,
Answering a prayer.
Married 51 years, Gerry describes his wife as a gutsy, determined lady who insists on living life as fully as possible. Fortunately, Connie is there by her side, helping her have the best quality of life, despite her diagnosis. In 2008, Connie’s exceptional care and compassion for Kathryn earned her the distinction of being named BAYADA Home Health Aide Hero of the Year in front of thousands of employees at the company’s annual Awards Weekend in Philadelphia.
Connie’s husband, sister, father, mother, and of course, Gerry and Kathryn, were there to support her as she accepted the prestigious award.
The launch of the Samsung Galaxy S4 last month garnered the type of media attention we’re getting used to for any new smartphone. Among the most talked-about features pre-launch was “eye tracking” – the phone’s ability to know where its user was looking and react accordingly. It took a few days for anyone to point out the application was more “head tracking” than eye tracking. As reported in the Wall Street Journal, it enables what Samsung is calling “smart pauses” and “smart scrolling”.
Face tracking and the Samsung Galaxy S4.
In short, the “smart pause” feature recognises whether you are in front of the phone, and can save what you were last doing when you move away from the device and decide to come back to it. “Smart scrolling” refers to the phone’s inbuilt eye-tracking technology, which detects whether the phone has been tilted and scrolls up or down accordingly.
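Samsung’s implementation is proprietary, but the principle behind “smart pause” can be sketched with ordinary face detection: pause when no face is visible to the front camera, resume when one reappears. The Python sketch below assumes the OpenCV library (pip install opencv-python) and a webcam; the actual pause/resume hooks are left as comments.

    import cv2

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)
    paused = False

    for _ in range(600):  # poll the camera for a while
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0 and not paused:
            paused = True    # viewer looked away: pause playback here
        elif len(faces) > 0 and paused:
            paused = False   # viewer is back: resume playback here
    cap.release()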
An eye to eye-tracking
Eye tracking is a set of technologies and techniques for measuring where a person is looking (the point of gaze), for how long (fixation), and the fast movements of the eye between fixations (saccades). The use of eye tracking dates back to the 19th century, when analogue, observation-based techniques were used to study reading behaviour: which words people focused on, which they skipped over, and which they revisited.
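Those measures translate directly into the standard velocity-threshold (I-VT) classification used in eye-tracking analysis: samples where the gaze moves faster than a threshold are saccades, and the slow stretches in between are fixations. A minimal Python sketch follows, assuming gaze positions in degrees sampled at a fixed rate; the 30°/s threshold is a common choice, not a universal constant.

    def classify_gaze(samples, dt=1/60, threshold_deg_s=30.0):
        """samples: (x, y) gaze positions in degrees, one per frame."""
        labels = []
        for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
            velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
            # Fast movement between frames is a saccade; slow drift is fixation.
            labels.append("saccade" if velocity > threshold_deg_s else "fixation")
        return labels

    print(classify_gaze([(0.0, 0.0), (0.1, 0.0), (5.0, 2.0), (5.1, 2.0)]))
    # -> ['fixation', 'saccade', 'fixation']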
Eye tracking demo.
Since the 1980s, eye tracking has been used in the field of human-computer interaction (HCI) for a myriad of purposes, including usability testing of interface design and the development of computer navigation devices for disabled users. Samsung’s innovation is likely a first step towards realising the full functionality of more advanced eye-tracking systems, such as those produced by companies including Tobii and Mirametrix. Tobii, based in Sweden, is considered the industry leader in eye tracking. The company develops hardware and software to analyse eye-tracking results, including goggle-based and traditional desktop systems. Mirametrix, headquartered in Montreal, is a relative newcomer, and has developed more affordable eye-tracking equipment.
The new Galaxy S4 was unveiled in New York on March 14. Andrew Gombert/EPA
In terms of research and application, eye tracking has been used in a wide range of disciplines, including vision research, psychology, cognitive linguistics, user experience testing and marketing. Eye tracking was recently used by the Australian media company Fairfax, as part of its neuromarketing research, to analyse the “eye gaze” of readers on the ads in the smaller tabloid-size format of the new Sydney Morning Herald and The Age. The company’s director of ad strategy was reported as saying the new format, from an advertising perspective, led to a “50% improvement in eye gaze”. Eye tracking is currently being used in spatial cognition, to help understand how people relate to maps and geo-location, and in virtual reality. In computer gaming, eye tracking opens up new possibilities in game immersion, where players can aim and turn through “gaze interaction”.
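As a hypothetical illustration of how “gaze interaction” might drive aiming (not any particular game engine’s API), the gaze point’s offset from the centre of the screen can simply be turned into a camera rotation, as in this Python sketch:

    FOV_DEG = 90.0           # assumed horizontal field of view
    SCREEN_W, SCREEN_H = 1920, 1080
    TURN_GAIN = 0.1          # fraction of the offset applied per frame

    def gaze_to_turn(gaze_x, gaze_y):
        """Convert a gaze point (in pixels) into per-frame yaw/pitch deltas."""
        # Offset from screen centre, normalized to [-0.5, 0.5].
        dx = (gaze_x - SCREEN_W / 2) / SCREEN_W
        dy = (gaze_y - SCREEN_H / 2) / SCREEN_H
        # Looking toward an edge turns the camera toward that edge.
        return dx * FOV_DEG * TURN_GAIN, -dy * FOV_DEG * TURN_GAIN

    yaw, pitch = gaze_to_turn(1700, 540)  # a glance right of centre
    # -> yaw is about 3.5 degrees per frame, pitch = 0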
Last September, the International Society for Photogrammetry and Remote Sensing (ISPRS) ran a workshop in Melbourne that included a hands-on session with eye-tracking equipment. Participants were introduced to the Tobii eye-tracking system and took part in an experiment on understanding soil maps, maintained by the Victorian Department of Primary Industries via Victoria Resource Online. This is believed to be the first workshop of its kind. Similar workshops, in which researchers and conference participants will gain exposure to eye-tracking software for spatial research, are planned for later this year.
Looking to the future
Eye-tracking technology is gaining momentum. From a commercial perspective, there’s great value in understanding where people’s attention falls and for how long, especially when designing technology products and marketing pixel-based real estate. In terms of research, there are many exciting opportunities for using eye tracking to understand visual processes and the ways in which people interact with information. No-one would claim the Samsung Galaxy S4’s move into this area is, in itself, a game-changer – but it’s something of a game starter, and a sign of things to come.
Chris Pettit receives funding from the CRC-SI. He is affiliated with SSSI and ISPRS.