Mobile technology is not only transforming consumers' everyday interactions but is also gaining prominence in education. Mobile learning, commonly known as M-learning, is a relatively new pedagogical tool that relies on small, portable computing devices, including smart phones, personal digital assistants (PDAs), and similar handheld devices. M-learners typically view content, lessons, or both in small, manageable formats that can be used when laptop or fixed-station computers are unavailable. To date, applications (apps) for these devices are used in educational, governmental, and industrial settings (McConatha, Paul, & Lynch, 2008).

Increasingly, a common feature of mobile technology is the touch screen interface found in tablet PCs, PDAs, and smart phones (Kane, Bigham, & Wobbrock, 2008). Yet touch screen technology can present significant accessibility barriers for blind users: many touch screens provide no audio or tactile feedback, making it difficult or impossible to locate items on the screen. For this reason, if a device lacks screen reading technology, users with visual impairments may need sighted assistance to learn the locations of on-screen objects, may require an alternative accessible interface, or, in the worst cases, may be completely unable to use the device (Kane et al., 2008).

While accessibility is one concern, understanding how to use the device is another as the trend moves toward mobile technology. According to a report from the U.S. Bureau of Labor Statistics, consumer spending on all types of cell phones has soared while expenditure on landlines has declined (Kendall, Nino, & Stewart, 2010). One factor contributing to this trend is the prevalence of smart phones.
On top of the conventional features of a PDA (i.e., schedulers, contact management, and note taking), most smart phones incorporate the functions of a personal computer with an intuitive interface. Among smart phones, the iPhone has emerged as one of the most popular. A feature that makes the iPhone attractive to users is the growing number of apps available through The App Store. Since the introduction of the iPhone in 2007, more than 3 billion apps have been downloaded for free or sold via The App Store (Kendall et al., 2010). The App Store continues to grow at an unprecedented rate, with many institutions and businesses developing apps to offer customer convenience and inspire loyalty. Further, many apps have been developed for teaching (Young, 2011), and among them a number are designed for people with visual impairments (AppAdvice, n.d.). Some are free of charge, while others require payment. Compared to equivalent assistive technology products sold on the market, these apps are less expensive and emerge as competitive alternatives. The App Store thus has great potential for offering useful apps to the visually impaired community. The following are three examples:

* Color ID: Developed by GreenGar Studios, this app helps people with visual impairment identify colors, for example when independently selecting colored files or choosing the color of clothing. The app identifies and speaks colors in real time, and also announces their hex values, allowing users to identify exactly what color the camera sees (Color Identifier, 2012).

* Digit-Eyes: Developed by Digital Miracles for the visually impaired, this app scans object barcodes, usually in the UPC and EAN formats found on over 23 million products. Users can also create text or audio barcode labels that are read back through the VoiceOver function or played back when scanned.
For example, users can create audio labels recording the expiry dates of products and attach them to containers or directly to perishable items; scanning a label with Digit-Eyes reads the information back to the user (Digit-eyes Audio Scanner, 2012). …
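To make concrete what an app like Color ID announces, the sketch below converts a sampled RGB pixel to a hex string and to the nearest named color by Euclidean distance in RGB space. This is purely illustrative and not the app's actual implementation; the `PALETTE` table and the function names `rgb_to_hex` and `nearest_color_name` are hypothetical, and a real app would sample live camera frames and use a far larger color table.

```python
# Illustrative sketch only (not Color ID's real code): derive the two pieces
# of information such an app speaks aloud -- a hex value and a color name.

# Hypothetical miniature palette of named reference colors.
PALETTE = {
    "black": (0, 0, 0),
    "white": (255, 255, 255),
    "red": (255, 0, 0),
    "green": (0, 128, 0),
    "blue": (0, 0, 255),
    "yellow": (255, 255, 0),
}

def rgb_to_hex(r, g, b):
    """Format an RGB triple as the hex string a screen reader could announce."""
    return "#{:02X}{:02X}{:02X}".format(r, g, b)

def nearest_color_name(r, g, b):
    """Return the palette name closest to (r, g, b) by squared Euclidean distance."""
    return min(
        PALETTE,
        key=lambda name: sum((p - c) ** 2 for p, c in zip(PALETTE[name], (r, g, b))),
    )

print(rgb_to_hex(200, 20, 30))       # -> #C8141E
print(nearest_color_name(200, 20, 30))  # -> red
```

A production version would speak these strings through a text-to-speech layer such as VoiceOver rather than printing them.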