Ben is 9!

Ben is now 9! I just reread my last proper blog about Ben’s birthday, when he turned 7, and much of it I would like to repeat: I can’t believe he’s 9! He’s so tall! Well done us for 9 years of parenting!


A lot has happened in the last two years, though in many ways things for Ben and us can seem similar – the appointments, the physio, the school, his interests. When you are there every day you don’t notice your kids growing up, until you catch sight of them across a playground and cannot believe that massive child once lived in your (my) womb.

Two years ago I said Ben had:

“made huge progress on using his eyegaze computer: using it almost every day, knowing exactly what he wants to do, reliably choosing stories and then navigating through them like a pro, using communication software to create messages that were totally appropriate to the moment.”

That was true then. It is also true now: what I want to tell you on Ben’s 9th birthday is that he’s got so much better at using his eyegaze computer (and low-tech communication book). That he often knows exactly what he wants to do, will choose the story he wants and navigate through it, that he goes into and then exits activities when he wants. That he uses his communication software to say things that are entirely appropriate.

Some of what I want to say sounds so similar to 2016 that it begs the question of what exactly has changed since he was 7. The complexity of the way Ben learns and communicates means progress, or development, is hard won. It can be inconsistent and hard to capture precisely. I look back on Ben two years ago and I know that he wasn’t as good at navigating around his communication device as he is now. That isn’t to say he wasn’t doing those things before, but perhaps I did see small improvements and leapt on them. I know more now, I expect more of him now. In some ways Ben is doing much the same stuff he was doing two years ago, and in other ways he’s made wonderful progress.

The timescale of helping Ben learn new skills is difficult – it can take a long time. It can be hard to know whether he is improving or learning, and therefore whether you are doing the right thing. It can feel pointless to continue working on something that appears to be having little impact. We knew that communication was a long term project, probably lifelong, but you still can’t help hoping that there will be some kind of miraculous progress at some point.

Which of course there isn’t. But there is slow incremental improvement, a bit more nuance, a little more unprompted communication. It is all subtle and understated. We are talking about complex skills, involving deft eye control, motor planning, memory and language knowledge. It’s difficult to articulate, but I’ll do my best.

Two years ago Ben was using a high-tech eyegaze device that we had bought ourselves – it was a laptop with an eye tracker connected via USB. He could only use it when sat at a table. For the last 18 months he has had an NHS-funded eyegaze device which mounts to his wheel-, school- and home-chairs. He is still quite light, so his weight isn’t a sufficient counterweight when the device is attached, which means it can’t be mounted all the time (e.g. on uneven ground), but he has it mounted on his chair at home and school for some portion of every day. For the last year he has also had additional educational funding so he has two hours of 1:1 time each day when an adult supports his communication specifically – modelling how to say something on his device, helping him find specific vocabulary, helping him use his device to do school work.

These developments have given him more time to practise and more opportunities to find pathways within his communication software. Sometimes progress doesn’t look how you imagined: over the summer he used all of this input to navigate out of his communication software every time I tried to talk to him using it. Every single time. This was disheartening – isn’t supporting his communication meant to mean he communicates with us more?


To do this he was adeptly selecting three different cells in different positions on three different pages in order to exit his PODD communication software and then go to one of the stories on his device – clearly communicating that he didn’t want to talk to me, he wanted to read (and be read) a book. So I had to admire his determination and skill, and respect his wishes – to force him to talk would be as inappropriate as trying to make a verbal child speak to you.

Since the summer he has chilled out in this particular regard, and will now talk with his device. It is mostly in the context of a specific conversation rather than unprompted speech. It’s getting increasingly relevant and timely, with some ‘scaffolding’ provided by his communication partner.


For his birthday Ben’s aunt Rosie sent him some incredible shoes with lights in the soles. Once we’d opened the package and put them on, we talked about them with his device. I went to the home page of his communication software, where there are various categories of words, and went to Special Events. He then chose ‘happy’ and ‘birthday’, and then I selected ‘present’. I said ‘this present was from…’, and I went back to the home page and into the People folder, where Ben selected ‘Rosie’.

I said, ‘Yes, from Rosie. And look what she’s bought you!’ I couldn’t remember exactly where to find the word but went into the ‘Clothes’ page. I then got distracted by trying to stop Molly inspecting Ben’s shoes so closely that she risked being kicked in the face. Then I heard Ben, via his device, say ‘shoes’.

He had gone to the folder called ‘shoes’ and then had ignored the cells called boots, thongs (it’s Australian software), AFOs, sandals and trainers to select the one that said ‘shoe’.


Ben knew what I was talking about, he knew exactly which words we needed, and he found them with his eyes by navigating between pages, selecting the correct one of 18 cells on the screen, repeatedly. It’s functional, practical, appropriate communication. It’s everyday speech, only it has taken years of practice and support for this to become close to our everyday.

It’s one step further forward for a boy who has spent nine years working so hard to make himself understood. He’s 9! I’m so proud!

 

The Perils of the Internet

Like practically everyone in the developed world, I am trying to be more thoughtful about how much time I spend on my phone and on social media. I try, with mixed success, not to spend time on my phone around the kids, and to avoid disappearing into a black hole of news about people I don’t know. Every once in a while I think about deleting the apps. Sometimes I actually do it, but I can’t quite resist going back, because those clever engineers know what they’re doing and I enjoy the pretty pictures and surreptitious snooping.

But it’s also because I get genuinely useful information and a sense of solidarity from the social media I use. It’s brilliant to be able to make connections with disabled people, to learn more about their experiences and their politics. It’s great to be able to talk to other parents of disabled children. I find out about events, equipment and approaches, from organisations and individuals. I think there is huge value in sharing experiences, hence this blog!

But once you find yourself in this little corner of the internet, there are many stories written by parents of disabled children, and it can be uncertain ground. There is a fine line between sharing experiences and oversharing information about a child who may not be able to consent.

I question myself a lot about what it is okay to write about and what is not, particularly when I read things which I think are inappropriate – perhaps because they show photos which I wouldn’t want on the internet if they were of me as a child, or because the writer dwells on how difficult their life is because they have a disabled child.

I worry that when that child is an adult they will be sad to read what was written about them. I am sometimes concerned that the parent’s account is disrespectful to disabled adults with the same impairments as their child. I am by no means beyond reproach – I am sure I have shared things that I thought were okay at the time, but would now not. Sometimes I think that maybe I shouldn’t be sharing anything at all, but I keep coming back to my conviction that as long as disabled children and adults are perceived as ‘other’ by much of society, there is value in attempting to puncture ignorance with our stories. I try my best to respect all of my children by carefully editing what I share (and perhaps I should share more photos of myself…).


What I am particularly drawn to are stories about disabled children overcoming communication difficulties, and adults who use Augmentative and Alternative Communication (AAC). It is inspiring to see people who have found the communication system that works for them, and are able to say what they want to say. It’s encouraging to see that methodical, consistent use of AAC can pay off – that children who were unable to communicate have a viable way to do so.

If there’s one thing these kinds of internet stories are good at, it’s celebrating the role of the parent, most likely the mother, in facilitating their disabled child’s access to AAC. Often the mother has fought for the right device, has pushed those surrounding the child to presume competence, has homeschooled the kids when the schools weren’t good enough, has modelled AAC language to their child consistently. The kid is therefore doing really well (possibly writing messages saying how grateful they are to their mother).

And, obviously, these stories are amazing. I want Ben to be the subject of these stories – celebratory, happy stories featuring quotes from a child that found it tricky to use expressive language.

So, does Ben have the right AAC system? Is he getting the right education? Is he getting enough specialist input? Should I be homeschooling him? Am I, personally, doing enough to encourage literacy? Are we modelling enough? Are we doing it every day, in every place, at every opportunity? Because if Ben doesn’t become expressively literate, will it be my fault?

These are the kind of myopic, self-obsessed thoughts I have as I peruse Facebook and it’s not that relaxing. I know I don’t want to homeschool any of my kids – I taught an English camp for Spanish kids when I was younger and I learnt from that summer that I am a terrible teacher. I shouted a lot, particularly when it looked like the kids were enjoying themselves too much. I think there are all sorts of advantages to going to school beyond literacy. But still. The pressure. My god, the pressure.

(Sidenote: if crafting expertise were crucial to teaching literacy, I’d be all over it. Gratuitous World Book Day photo:)

IMG_1220

And then, occasionally I get a moment of thinking we’re not failing. We’re doing our best, and maybe we’re actually doing okay.


Today Ben was home from school because he has yet another cold (don’t get me started on the sickness count in this house this winter, it is beyond tedious). Molly was with us, and I was pottering around trying to get stuff done between the nose wiping and Calpol distribution. Molly had pulled Ben’s YES and NO symbols off the velcro on the back of his chair, and she was standing next to him holding them up, saying ‘Yes, Ben. No, Sam’.


This is how Ben answers questions – he looks at yes and no symbols. She is doing this because two year olds copy what they see around them. She has noticed our modelling and she is using AAC with her brother. It’s a little bit magical. We must be doing something right.

The temptation to interrupt


If you have a child who doesn’t follow the typical path it’s difficult to have a sense of where they will end up. I don’t mind this too much; I don’t find it that useful to have conversations about what Ben may, or may not, be doing in ten years’ time.

But everyone looks for role models for themselves or their kids, and some sense of where the path might be going. Disabled adults are rarely found in mainstream media, so I was lucky recently to be at a study day where a panel of five Augmentative and Alternative Communication (AAC) users answered questions from the audience using high-tech devices. Three of the five used eyegaze, exactly like Ben is learning to. There is a video here.

It is important, and inspiring (not in an inspiration-porn kind of way), to see people using AAC to talk so eloquently. THAT is what we are aiming for.

But I was really struck by something that one of the panellists – Kate Caryer – said at the event: she pointed out that people sometimes think of a communication aid as a gift or a toy that users should feel grateful to their local authorities for providing, whereas it is in fact a human right.

Respecting Ben’s right to communicate means his device needs to be there, in front of him, as much as possible, not just when we decide we can fit it into his daily life. We need to make sure the batteries are charged, and the mounting arm is ready when it is needed. We need to force ourselves to fit it even when it feels like a bit too much effort on a busy day.


But more profoundly, we have to be ready to hear what is being communicated. We have to alter our own culturally-constructed notion of how conversations work.

I find it uncomfortable to leave gaps in conversation – I feel I have to fill them. I first realised this when we lived in the Middle East and would spend time with Syrian friends who were happy to sit companionably with long pauses in conversation. I had to teach myself to enjoy this time and not fill the gaps with mindless waffle. This was made easier by my rudimentary Arabic.

Someone using a communication aid to talk is probably going to take more time than a typically speaking person. At the study day, it took time for the five individuals there to answer questions from the audience: they needed to hear the question, navigate (with eyes or fingers) around their communication device to find the vocabulary for what they wanted to say, and then speak it.

This often makes conversation with an AAC user slower than we are used to, and I for one need to force myself to accept that rhythm of speech. This means waiting the extra 30 seconds to see what your conversational partner wants to say, and not interrupting. Ben isn’t able to shout immediately, in the way that Max does frequently, ‘Mummy, I was actually in the middle of talking when you interrupted me!’

I frequently screw up even when I am trying my best. I realised recently that after a conversation with a woman who uses a communication aid, I had said goodbye and walked away. I hadn’t waited to see if she wanted to say goodbye, or even if she had anything else to say. I felt a little sting of shame when I later realised. We can all have good intentions, but we don’t always behave as well as we’d like.

If you manage to fully embrace the alternative pace, there are rich rewards. A few years ago when Ben was just learning to look at Yes and No symbols on the armrests of his wheelchair to answer questions, we went to a local park and met some friends. One of them, who happens to be a nurse, crouched down in front of Ben so she was at his level, asked Ben if he was enjoying his new school, and then waited.

Ben slowly and deliberately looked down at the Yes symbol. It was the first time he had totally independently answered a question from a stranger, and it happened because she asked the question in the right way for him.

If we accept that people with communication difficulties have a right to talk – and therefore to be given the support, equipment and training they need – then they also have the right to be heard. And we, the people taking our communication skills for granted, have to learn how to listen. Not make assumptions about what is being said, or interrupt, or fill the gaps in conversation with inconsequential waffle, but actually listen.

I mean it’s a good tip for life generally – many marriages could benefit from partners actually listening to what the other is saying. But rather than forcing the AAC user to navigate their way through their device to say ‘Stop interrupting me’, maybe we should just take it upon ourselves to get our own houses in order.

 

Let him shout (at Trump or anyone else)

In the dim and not-so-distant past when it seemed impossible Obama would be replaced by a misogynist as the President of the USA, I watched a video of someone protesting at a Trump rally. The internet was full of footage of protestors at Trump rallies, and inevitably Trump insulting the protestors, but this was different. Different because the protestor was a 12 year old boy, who has cerebral palsy, who uses a wheelchair, who talks using a communication device.

JJ Holmes lives in America and had been following election coverage by using his iPad to search for Trump events, typing the words in using his nose. He knew Trump had mocked disabled people and eventually convinced his mum to take him to a Trump rally so he could protest against him. She warned him it might get ugly but he wanted to go anyway.

He pre-programmed phrases into his computer before they went, so that at the rally he could shout slogans like ‘Trump mocks the disabled’ and ‘Dump Trump’. He could play the messages through his communication device by pressing a button with his leg (there’s a brilliant video of him using the button here). As it wasn’t that loud, his mother and sister chanted along so he couldn’t be drowned out too easily.

After a bit the supporters surrounding them turned rowdy. Trump heckled JJ from the stage and told security to ‘Get them out’, and they were jostled out amidst JJ’s wheelchair being shoved and some pretty horrible things being said to all three of them.

So…

I mean obviously there’s much to be depressed about in this little anecdote, and that was before we even knew Trump would become President.

But out of the murk I find inspiration: a 12 year old disabled boy convinced his mother to take him to a political rally to protest Trump’s attitude toward disabled people! He programmed chants into his communication device!

I can’t tell you how proud I would be to have raised a son who was so politically aware and knew the power of protest. A boy who knew he was disabled and knew that was okay. Who knew that Donald J Trump (as JJ refers to him) was wrong to belittle disabled people. Who was brave enough to go to an adult event and make himself heard. Who was willing to be heckled and shoved to make his point.

I would be so proud of my son for learning to communicate with his nose and his leg, and having the patience and determination to make himself heard.

I am filled with admiration for a mother who took her 12 year old seriously, and facilitated his protest even though she knew it could get ugly. Who told journalists, ‘He’s not some puppet I wheeled in there. This was him – this was all him.’ I would be proud to be such a staunch ally. To treat communicating through a device as equal to talking (or shouting). And to have raised a daughter who wants to protest too.

I’m proud of a world where disabled people are helped by technology and supported to communicate when they can’t speak.

What the whole story relies on is JJ’s ability to shout (even if his device’s voice output isn’t quite loud enough to be heard above enthusiastic Trump supporters).

Ben has been using his eye-gaze computer for some time now and is beginning to create messages or questions within his communication software, or he uses the computer to read himself stories. We try to give him as much autonomy over the computer as possible (whilst hovering around, interfering, facilitating and modelling) so he can choose what he wants to do. When he chooses to use it for communication, we take the messages he says seriously.

By ‘says’ I mean that as he selects words within the communication software, they go into a window at the top of the screen. When Ben goes to the ‘speak’ cell the computer says all the words out loud. When he is reading himself a story he selects the ‘speak’ cell and a paragraph of, for example, Mr Stink by David Walliams, will be read aloud by the computer.


But the loudest volume of Ben’s computer isn’t that loud. He can hear it as he is quite close, but you can’t really hear it if you are across the room, and you definitely can’t hear it if Max or Molly are squawking. So Ben’s speech and language therapist recently gave us a speaker to sit on top of the laptop, the volume of which can be controlled within the communication software. It can be turned up really loud, because if you are going to give an AAC user the opportunity to communicate as a neurotypical child would, you need to give them the ability to SHOUT if they want to. Or whisper. Or somewhere in between. Ben should be able to compete in volume with his siblings even if it’s not something I particularly relish the thought of.

In the same way that I ask Max many, many times a day to please not shout, Ben should be able to be loud and annoying. There is a temptation to think of children like Ben as ‘good’ because they are relatively quiet and controllable, but part of being a child (any child) is being disruptive and protesting and Ben should have as many opportunities to do that as he has to be compliant. If we get to the point of Ben purposefully turning up the volume of his computer and shouting at us all, and me having to tell him to turn it down, that will be a good problem to have.

Meanwhile, JJ’s recovery from the Trump rally ordeal was helped by meeting Obama the following day. Obama was every bit as friendly as you would expect, crouching to JJ’s level, listening to his messages and shaking his hand.


(Holmes family photo taken from Washington Post website)

According to a Clinton official, when he was told he was about to meet Obama, ‘JJ jumps out of his seat and erupts into cheer . . . his smile almost bursting out of his face. His body overcome by light, when just the day before it almost succumbed to hate.’

Let’s keep heading toward the light, people. Let’s confront bullies and prejudice. Let’s give all our kids the ability to communicate, to shout and to protest. Let’s take them seriously and hope that one day we might meet Obama.


How we learn to talk


I have just done a two day course on a system that I hope will help Ben communicate with us and everyone else.

Communicating is one of Ben’s biggest challenges. He has a lot to say, but his severe disability hampers all of his efforts to tell us what he thinks or wants: he can’t speak, he can’t point, yet he understands a lot.

He has found highly effective ways to get round this day-to-day. He can smile or frown, laugh, shout or show us his bottom lip (the degree of protrusion of the lip is directly proportional to his sadness, on a spectrum that ends in full-blown sobbing). Whingeing indicates dissatisfaction. Looking expectant means ‘I’m interested’.

Ben has the intellect to express far more sophisticated ideas and desires than this – but he needs the tools. So we are working with Ben’s school and a number of speech and language therapists to maximise his opportunities, using Augmentative and Alternative Communication, or AAC, the term used to capture a whole range of communication systems that don’t rely solely on speech.

Everyone involved in this effort comes together at a termly review meeting. Last week that involved eight professionals sitting around a table for several hours discussing progress. The combined level of expertise is immense.

And partly because of all of this input, Ben is doing brilliantly. In last week’s meeting someone pointed out that (PROUD MUM ALERT) Ben’s spelling is on a level with his non-disabled peers, so it’s important to not push him too hard. He already has to work much harder than a typical kid to spell consonant-vowel-consonant words but he’s doing it – on an eye-gaze computer or by looking at magnetic letters with Granny.


The major new communication concept we are introducing to Ben is a PODD book, developed by an Australian woman called Gayle Porter, a minor celebrity on the aided language scene.

PODD books are full of symbols that you point at to express yourself. The ambition is that Ben will eventually be able to show us which symbols he wants to ‘say’ (video of a child using PODD here).


The success of using aids like PODD books is totally dependent on the expertise of the people surrounding Ben. The course I was on last week was an attempt to increase my confidence and knowledge.

I came away enthused. This has huge potential for Ben. But at the same time I’m nervous, intimidated even. I took away three main insights from the course that really highlighted how big an endeavour this will be:

1. Do we have the stamina to do this justice?

A typical child hears spoken language for at least 18 months before they start talking (over two years if you’re Ben’s brother Max). It therefore follows that we should be modelling use (otherwise called ‘receptive input’) of the PODD book to Ben for at least a year before we expect him to be using it with us. If we were really to mimic normal acquisition of speech, James and I would use it to talk to each other in front of Ben.

At the course we were shown totally inspirational videos of young children communicating, in ways that just would not happen if they hadn’t been taught to use PODD. But … in some of the videos, their mothers (and they were almost exclusively mothers) had modelled use of the PODD books with their children for over a year before their children started interacting with the book.

The level of skill of these women is impressively high – they can navigate expertly from page to page and find symbols really quickly. But even more important is their level of determination. They have dedicated years of their lives to teaching their kids to use PODD books, even when the kids weren’t apparently engaging with them. It is hugely inspiring, but also massively daunting. What if we’re not committed enough? What if we get bored and demoralised, and that means we fail Ben?

2. You can’t learn a language if you’re placed on mute

If you think of the PODD book as Ben’s way of talking, then you have to have it available to him all the time. Otherwise you are effectively silencing his voice. It is close to the equivalent of gagging a verbal child to stop them talking.

We are already guilty of doing this: we introduced Yes and No symbols on Ben’s chair so he can look at these to answer questions. A few weeks ago, the No got puked on, the sellotape got wet and the symbol fell off. Despite daily good intentions, I still haven’t got round to sticking it back on. And I haven’t got round to putting the Yes and No symbols on his wheelchair.

I introduced an opportunity for him to tell me things and then I took it away. Shit.

3. This needs to be a whole family enterprise

On the course I also saw videos of young children using PODD books really competently with their disabled siblings. Max is really trying to work out the rules of the world around him at the moment and where Ben fits in: for example, ‘big boys’ walk and babies don’t, but Ben is a big boy and doesn’t walk (cue confused/amazed face). We’ve had discussions about whether Ben can hear Max talking, because normally people talk back when they hear you, but Ben doesn’t talk.

Max already talks to Ben on his terms – he says ‘night night’ to Ben every night and waits for Ben to look at him because he knows that’s Ben’s way of saying goodnight back. When Max asks Ben a question he says, ‘Ben, can I play with your Peppa toy, YES OR NO!’

IMG_4770

Max could easily pick up the way a communication book works (the photo above shows him at 18 months getting interested in one), but only if he sees us using it. We have to normalise it within our family, even if the wider world insists on only speaking to each other.

I should reiterate that I really am filled with enthusiasm – the PODD system has been thoughtfully developed and children exactly like Ben have learnt to use it proficiently. But, oh my goodness, what a weight of expectation. Remind me of this blog in a year, when my enthusiasm for pointing at symbols might be wavering and I need to remember that we knew we were in it for the long haul.