For eight years I have worked with a wonderful student named Caleb, who has multiple disabilities and Cortical Visual Impairment (CVI) and is served as a medical homebound student due to his fragile medical status. As a result of his cerebral palsy and severe speech/language disability, it has been difficult at times to assess Caleb’s true cognitive abilities and create meaningful instruction for him. In his preschool and early elementary years, his team members tried many different instructional materials and communication systems with Caleb, including PicSyms picture symbols created with Boardmaker™ software. However, Caleb did not respond to illustrations or photographs at that time. I knew that Caleb had definite preferences for certain toys and sounds in his environment, so I tried to think of ways to incorporate those items into instructional and communication materials. I took photographs of a few of Caleb’s toys, both preferred and non-preferred, placing each toy in front of a black felt board (in this case, the Invisiboard by the American Printing House for the Blind) so that there would be no visual distractions from the main object of the photo. At that time, I also made recordings (using the voice memo app on my phone) of the sounds some of the toys made. Examples included a bouncing ball and the various sounds generated when Caleb played with his favorite toy: a shape sorter made to look like a barn on a farm, which plays music and makes clacking noises when the child places square blocks into holes on the toy. Below is a picture of the barn toy.
Once I had the pictures and sounds saved, I used the “See.Touch.Learn Pro” app on an iPad to start presenting the pictures and sounds together. This is a very powerful app that allows the user to create an unlimited number of customized lessons for any subject and any developmental level. The Pro version also gives the user access to thousands of lessons created by other users around the world. The Pro version is expensive (I bought it when it cost $30.00; the current cost of the app on the iTunes store is $50.00), but I feel it is well worth the price for the flexibility it gives me to create iPad-based lessons for any student.
I chose to use the pictures and sounds on the iPad instead of presenting a printed photo along with the actual toy because Caleb was very visually responsive to cause-and-effect activities on the iPad, and I felt this presentation would be more meaningful to him than a printed photo. In the first lessons I created, I used only pictures of preferred toys. One picture was presented at a time, and within the app I recorded my voice prompting Caleb to “Touch the (ball, barn, etc.).” Along with the prompt I recorded the sound of that toy, either by activating the toy itself or by playing the voice memo recording on my phone. I then used hand-over-hand assistance to help Caleb touch the picture, which counted as a correct answer in that lesson. When a correct answer is selected in the app, a sound is played (various sounds, such as a ding, buzz, or “yay,” can be selected in the app’s settings). Eventually I was able to fade the hand-over-hand assistance, and Caleb was able to touch the single picture himself. One reason I have really liked this app is that when only one or two pictures are presented at a time, they are large and easy to touch for students like Caleb who have difficulty with fine motor control. Touching the picture itself is the response; there are no buttons that the student has to find and touch.
After many trials, Caleb was consistently able to touch the single picture, and he often smiled or laughed when he heard the sounds of his favorite toys. At this point, I created a lesson in which two pictures were presented at a time, one picture of a preferred toy and one picture of a non-preferred toy. A picture of a non-preferred toy is shown below; the toy is a stuffed frog that makes various sounds when certain parts of the toy are squeezed or pushed. Caleb did not have the fine motor control necessary to activate these sounds, so this toy was not much fun for him to play with.
Each exercise in the lesson used the same “Touch the…” prompt, along with the name of the toy and its associated sound. Hand-over-hand assistance was again provided to help Caleb learn to touch the picture indicated by the prompt. Eventually the hand-over-hand assistance was again faded, and Caleb gradually increased his consistency of correct responses. We were all very excited at this point because it was clear that Caleb had made the breakthrough of associating pictures with concrete objects! After this success, over time we were able to use the same methods to introduce academic concepts, such as shapes, colors, and alphabet letters. Although those types of lessons were not as fun and motivating for Caleb, he was still able to demonstrate cognitive concepts by choosing correct answers. It should be noted that eventually I switched to a customized prompt of “Show me the…” to be consistent with the language used in our state’s alternate standardized assessment.
Due to the success we had presenting pictures and sounds in the See.Touch.Learn Pro app, I used the same methods to create stories about objects, sounds, and people which were meaningful in Caleb’s world. For these stories I used an iPad app called “StoryCreator Pro”. This is another app that I have just loved for creating personalized stories for students. However, I am sorry to report that in the most recent iOS updates the app has lost the ability to share stories by email or back them up online, which has caused me to stop using it. Although creating stories in the app is very quick and easy, it has not been practical for me to create them directly on the student’s device during my allotted service time with that student. Originally, I could create the stories on my iPad at home, then email them to the student’s device or use online backup with my account. I am still looking for a replacement app that will do everything I was able to do in StoryCreator Pro, but I have not yet found one that I am happy with. I have begun learning to use the iBooks Author app, which is only available for the Mac OS, and it seems to be pretty flexible and easy to use, but at this time the fact that it does not allow the embedding of sound is a real disadvantage. However, I am in the process of creating storybooks in iBooks Author with repeated lines that can be used with a recordable switch for shared reading, which is one of Caleb’s current IEP goals.
In any case, by using the StoryCreator Pro app we were able to create meaningful stories with sounds for Caleb, and he continues to enjoy them. In fact, we have been working on choice-making with flash cards: we place one word representing each story on a card and give Caleb the opportunity to choose which story he wants to read by indicating a card with eye gaze or touch. One of my favorite stories was created and narrated by Caleb’s mom and his two sisters. Below is a screenshot of one of the pages from this book. The picture shows Caleb’s toy car, with the line, “This is Caleb’s red car. It goes Vroom, Vroom!” The recordings of his sisters’ voices at this point in time (especially the sister who has a flair for the dramatic) are a real treasure, so I saved the recordings on a CD and gave it to Caleb’s mom as a keepsake.
As another example of the value of using meaningful content in instruction, below is a screenshot of a page from a story I created, called “Lawn Mower!” The page shows a picture of a red riding lawn mower. The mower is similar to the one that Caleb’s family owns, and Caleb loves to ride on it with his dad or grandfather. Caleb’s mom took a video of Caleb riding on the mower and shared it with me, and I was able to use the audio from the video in the narration of the story. This audio clip is especially fun because you can hear Caleb yelling with joy as he rides. The line on this page reads, “This is what a mower sounds like.”
Again, Caleb loves this story because he gets to see a picture and hear sounds associated with an activity he really enjoys. The same story would probably elicit no reaction from another student. Through my experiences with Caleb, I learned so much about “thinking outside the box” and the importance of assessing a student’s preferences when teaching students with CVI. Granted, this type of individualized instruction does involve a certain amount of preparation time and assessment of what really “floats a student’s boat,” but if it can help reach a student and provide motivation for activities, it is well worth it. In Caleb’s case, I would periodically just ask his mom, “OK, what is he really into now?” Also, once you have a saved library of meaningful pictures and sounds for a student, you can use them in a variety of ways, and it is easy to add new pictures and sounds to the library on the spot with a smartphone or other mobile device. I hope that the ideas and methods presented in this post serve as a jumping-off point for other teachers to think about teaching students with CVI in a new way. And if anyone finds a story-creating app that rivals StoryCreator Pro, I’d love to hear about it in a Paths to Technology post!