EdTech medic
EDUCATIONAL TECHNOLOGY, ONE STEP AT A TIME
Humans are on the precipice of a technological evolution like nothing we have ever seen. Everything is changing at an exponential pace, and we are at the “knee” of the exponential curve. All of the advances of the 19th and 20th centuries combined will pale in comparison to what we’ll see in the next 20 years. To put it simply, in the words of Marc Prensky, “everything is changing much faster than in the past” (Baptiste, 2014).

Let’s look at where we are today to highlight some of these rapid changes. In 2015, renewable energy overtook fossil fuel for the first time in history (Hirtenstein, 2016). We’ve begun developing methods for scrubbing CO2 from the atmosphere, and we can convert nuclear waste into diamond batteries that will last 5,000 to 10,000 years. Autonomous vehicles will soon dominate the roads, powered by clean renewable energy. Drones will become increasingly commonplace, reusable rockets are reducing the cost of space exploration and will continue to do so, humans will soon colonize the Moon and Mars, and on Earth, close to 50% of all jobs could be replaced by mechanization and robotics within the next 20 years (Frey & Osborne, 2013). Automation and artificial intelligence are making new advances almost weekly, and these technologies have already begun to replace taxi drivers (Press, 2016), lawyers (Weller, 2016) and sports writers (Liberator, 2016). Everyday household items like mirrors and toilets will diagnose disease long before the patient has symptoms (Patkar, 2014; Ratner, 2016; Report, 2011). Doctors will consult increasingly with artificial intelligence to diagnose patients (Hernandez, 2014), robots will perform surgeries independently (Seaman, 2016), nanotechnology will circulate through the body eliminating cancer cells (Conger, 2016; Gao & Yuan, 2014), and DNA editing will eliminate major diseases one by one (Maxmen, 2016). So what kind of changes can we expect in the future of learning in higher education?
For one, it will become increasingly mobile, automated and unbundled from “fixed timing and courses of study to more competency-based approaches” (Slocum, 2015). Even today, “we can take out our smartphone and access all of human knowledge” (Ray Kurzweil). Textbooks and lectures are becoming less relevant as students have multiple sources of learning at their fingertips and can choose to learn from the best professors in the world instead of the ones who work nearby. Artificial intelligence is taking on the role of teaching assistants (TAs): grading papers, answering student questions in forums, reminding students of important dates over email and providing instructional correspondence, sometimes without the students even knowing that they’re being helped by a robot (Warschauer & Grimes, 2008; Slocum, 2015).

Three technologies I believe will revolutionize learning in the future
Artificial Intelligence (AI)

AI is defined as “computer systems that are able to perform tasks that normally require human intelligence, such as visual perception, speech and image recognition, decision-making, and translation between languages.” Intelligent tutoring is one form of AI system that currently exists; it provides the student with immediate, customized instruction without the intervention of a human teacher. In one study, students learning with intelligent tutoring outperformed their classmates by an average of 15% on standardized tests (Koedinger & Aleven, 2015). Many of us are familiar with AI assistants like Apple’s Siri, Microsoft’s Cortana, Amazon’s Alexa and Google Assistant. These devices convert voice to text, follow simple commands and answer simple questions. However, the future of AI will see natural language processing (NLP) and machine learning (the ability of AI to learn through trial and error) developed to the point where AI becomes a truly mobile mentor capable of normal conversation. Until recently, deep learning by computers (e.g. Google’s voice and image recognition) has been about 10 times slower than human learning (Gohd, 2017). Google has been leading the field of deep learning, publishing 218 papers in 2016, and has created an AI that is almost as fast at learning as humans (Regalado, 2017). The future of AI, and what I hope to see as an educator, is assistive devices and programs that speak and understand natural language, understand the student’s learning needs, and provide lessons that are responsive, adaptive and differentiated.

Virtual and mixed reality learning

A virtual reality learning environment (VRLE) is one in which a student is actively engaged within a virtual environment that encourages communication between the student and the teacher to solve problems. Like Second Life, a VRLE can provide synchronous learning in which a student’s avatar interacts with the teacher and other students.
Students can form study groups, do group work and collaborate. What distinguishes a VRLE from other synchronous platforms like Blackboard Collaborate is that students can be immersed in any imaginable environment, from the Pyramids of Giza to the operating room of a teaching hospital. Virtual worlds are powerful and immersive, especially through the lens of a head-mounted display (HMD), and have been effective in improving learning outcomes (Merchant et al., 2014). VRLEs allow students to share someone else’s sensory reality, and as such are proving to be a remarkable medium for teaching empathy (Gerry, 2017) and understanding mental illness (Gayer-Anderson, 2016), and are even used to provide social cognition training for those with autism (Kandalaft et al., 2012). Mixed reality blends the virtual with the real world, where holographic and physical objects co-exist. HMDs such as Microsoft’s HoloLens project holograms, while Magic Leap’s HMD uses a photonics chip with a digital light field signal that, in simple terms, projects images directly onto the retina. In both cases these are stereoscopic 3-D images, and, speaking specifically of Magic Leap, they look quite real according to the MIT Technology Review (Metz, 2016). Virtual and mixed reality technologies are still in the early stages of development. VR is sometimes described as an isolating experience, since the user can’t see outside of the display, and it causes vertigo in some users. In contrast, HoloLens allows the user to move about, to see both the holograms and the real world, and to interact with other learners. However, the field of view for these devices is still limited. In branding videos for the HoloLens, you get the impression that images fill the entire field of view; in reality, the field of view covers only about 30%. To see an entire human anatomy hologram, you have to raise and lower your head instead of raising and lowering your eyes.
This limits the immersive experience. With improvements to the field of view, holographic technology will eventually make learning a completely immersive experience, converting any room into an entirely different place and space, be it the Natural History Museum, the Piazza Navona or the surface of Mars. Holographic technology continues to advance, and a completely immersive learning environment like the Star Trek Holodeck is a matter of when, not if.

Haptic technology

For learning to be truly robust and immersive, the objective is to make it experiential, and that means engaging as many of the senses as possible. Haptics is the science of applying tactile sensation to computer applications or, as I’ve discussed, holographic applications. It is a form of kinesthetic communication that recreates a sense of touch by stimulating receptors in the skin with vibration, force, pressure or resistance, using electric actuators, pneumatics and hydraulics. According to the Virtual Reality Society, “Modern gamepads are also imbued with haptic feedback. You may feel the thudding steps of a monster, the kick of a firearm or the rumble of an earthquake thanks to intelligent motors and weights in the device.”

Bringing it all together

No two students learn at the same pace or in the same way, and yet at this time it is challenging to meet the needs of individual learners. The merging of advanced AI, holographic and haptic technology could, and hopefully will, usher in a new era of individualized, competency-based and adaptive learning. Learners will be able to converse with AI in their natural language, while holograms and haptics create the visual and tactile stimulation that will make learning truly experiential and contextual. AI will be the brains: the mentor, the coach, the motivator and the big-data analyzer, enabling the student to see, hear, speak and feel through the blending of all three technologies.
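To make the adaptive, competency-based idea concrete, here is a minimal sketch of the kind of logic an AI mentor might run behind the scenes: track a mastery estimate per skill and always serve the weakest skill at a matching difficulty. All of the names here (the functions, the skill labels, the thresholds) are illustrative assumptions of mine, not a real tutoring system’s API.

```python
# Hypothetical sketch of an adaptive, competency-based lesson selector.
# Mastery is a per-skill estimate between 0 (novice) and 1 (mastered).

def update_mastery(mastery, skill, correct, rate=0.3):
    """Nudge the skill's mastery estimate toward the latest result."""
    target = 1.0 if correct else 0.0
    mastery[skill] = mastery[skill] + rate * (target - mastery[skill])
    return mastery

def next_lesson(mastery):
    """Serve the least-mastered skill at a difficulty tier to match."""
    skill = min(mastery, key=mastery.get)
    if mastery[skill] < 0.4:
        level = "intro"
    elif mastery[skill] < 0.8:
        level = "practice"
    else:
        level = "challenge"
    return skill, level

# Example: a paramedic student misses a cardiology question, so the
# tutor's next lesson targets cardiology at an introductory level.
mastery = {"airway": 0.5, "cardiology": 0.5, "pharmacology": 0.5}
mastery = update_mastery(mastery, "cardiology", correct=False)
print(next_lesson(mastery))  # -> ('cardiology', 'intro')
```

Real intelligent tutoring systems use far richer student models than this running average, but the loop — observe, update the model, pick the next differentiated step — is the core of what makes them responsive and adaptive.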
What I envision is a truly mobile learning experience in which AI is accessible through the learner’s smartphone and tethered wirelessly to a pair of computerized contact lenses and digitized haptic garments. This will usher in a new era of experiential learning.

References
Baptiste, L. (2014). The future of learning: What today's stats mean for tomorrow's learners. Retrieved from https://www.marsdd.com/news-and-insights/future-learning-todays-stats-mean-tomorrows-learners/

Conger, K. (2016). How nanotechnology could detect and treat cancer. Retrieved from https://phys.org/news/2016-05-nanotechnology-cancer.html

Frey, C. B., & Osborne, M. A. (2013). The future of employment: How susceptible are jobs to computerisation? Technological Forecasting and Social Change, 114, 254-280. doi:10.1016/j.techfore.2016.08.019. Retrieved from http://www.oxfordmartin.ox.ac.uk/downloads/academic/The_Future_of_Employment.pdf

Gao, Y., & Yuan, Z. (2014). Nanotechnology for the detection and kill of circulating tumor cells. Nanoscale Research Letters, 9(1), 500. doi:10.1186/1556-276x-9-500

Gayer-Anderson, C. (2016). The application of virtual reality technology to understanding psychosis. Social Psychiatry and Psychiatric Epidemiology, 51(7), 937-939. doi:10.1007/s00127-016-1262-z

Gerry, L. J. (2017). Paint with me: Stimulating creativity and empathy while painting with a painter in virtual reality. IEEE Transactions on Visualization and Computer Graphics, 23(4), 1418-1426. doi:10.1109/tvcg.2017.2657239

Gohd, C. (2017). Google created an AI that can learn almost as fast as a human. Retrieved from https://futurism.com/google-created-an-ai-that-can-learn-almost-as-fast-as-a-human/

Hernandez, D. (2014). Artificial intelligence is now telling doctors how to treat you. Retrieved from https://www.wired.com/2014/06/ai-healthcare/

Hirtenstein, A. (2016). Record green power installations beat fossil fuel for first time. Retrieved from https://www.bloomberg.com/news/articles/2016-10-25/record-green-power-installations-beat-fossil-fuel-for-first-time

Kandalaft, M. R., Didehbani, N., Krawczyk, D. C., Allen, T. T., & Chapman, S. B. (2012). Virtual reality social cognition training for young adults with high-functioning autism. Journal of Autism and Developmental Disorders, 43(1), 34-44. doi:10.1007/s10803-012-1544-6

Koedinger, K. R., & Aleven, V. (2015). An interview reflection on “Intelligent Tutoring Goes to School in the Big City”. International Journal of Artificial Intelligence in Education, 26(1), 13-24. doi:10.1007/s40593-015-0082-8

Liberator, S. (2016). Your days could be numbered if you're a sports writer: The Associated Press is using AI to write Minor League Baseball articles. Retrieved from http://www.dailymail.co.uk/sciencetech/article-3668837/Your-days-numbered-sports-writer-Associated-Press-using-AI-write-Minor-League-Baseball-articles.html

Maxmen, A. (2016). Easy DNA editing will remake the world. Buckle up. Retrieved from https://www.wired.com/2015/07/crispr-dna-editing-2/

Merchant, Z., Goetz, E. T., Cifuentes, L., Keeney-Kennicutt, W., & Davis, T. J. (2014). Effectiveness of virtual reality-based instruction on students' learning outcomes in K-12 and higher education: A meta-analysis. Computers & Education, 70, 29-40. doi:10.1016/j.compedu.2013.07.033

Metz, R. (2016). What it's like to try Magic Leap's take on virtual reality. Retrieved from https://www.technologyreview.com/s/534971/magic-leap/

Patkar, M. (2014). New camera-centric smartphone app puts healthcare in your pocket. Retrieved from http://www.thedailybeast.com/articles/2014/03/28/new-camera-centric-smartphone-app-puts-health-care-in-your-pocket.html

Press, T. A. (2016). World's first self-driving taxis start taking passengers. Retrieved from http://www.cbc.ca/news/technology/driverless-taxi-nutonomy-1.3735375

Ratner, P. (2016). Scientists invent a device that can detect 17 diseases from your breath. Retrieved from http://bigthink.com/paul-ratner/scientists-create-tricorder-like-device-that-can-detect-17-diseases-from-your-breath

Regalado, A. (2017). This chart illustrates how AI is exploding at Google. Retrieved from https://www.technologyreview.com/s/603984/googles-ai-explosion-in-one-chart/

Report, P. S. (2011). The world in 2100. Retrieved from http://nypost.com/2011/03/20/the-world-in-2100/

Seaman, A. M. (2016). Completely automated robotic surgery: On the horizon? Retrieved from http://www.reuters.com/article/us-health-surgery-robot-idUSKCN0Y12Q2

Slocum, D. (2015). Surprise! Georgia Tech teaching assistant isn't human, she's a robot. Retrieved from https://futurism.com/suprise-georgia-tech-teaching-assistant-isnt-human-shes-robot/

Swearer, R. (2016). 4 ways the future of learning is changing. Retrieved from https://redshift.autodesk.com/future-of-learning/

Warschauer, M., & Grimes, D. (2008). Automated writing assessment in the classroom. Pedagogies: An International Journal, 3(1), 22-36. doi:10.1080/15544800701771580

Weller, C. (2016). The world's first artificially intelligent lawyer was just hired at a law firm. Retrieved from http://www.businessinsider.com/the-worlds-first-artificially-intelligent-lawyer-gets-hired-2016-5
Author: I am a paramedic educator and educational technology enthusiast. The "medic" part of this blog title simply refers to my paramedic background.
October 2018