
Max’s story: What being blind has taught me about building more accessible software

Max, a software engineer at Beam, shares how his experience of sight loss shapes the way we build technology, and why inclusive design must be built in from the start.

Technology is at the centre of everyday life. We use it to manage our money, shop, communicate with loved ones, and do our jobs.

But for millions of people, technology can also be a barrier.

If software isn’t designed with accessibility in mind, everyday tasks can suddenly become difficult, or even impossible, to complete.

I know this first-hand. I’m Max, a software engineer at Beam, and I’m blind. Navigating software looks very different for me. Instead of scanning a page visually or clicking through menus with a mouse, I rely on assistive technology to understand what’s on the screen and move through it.

And I’m not alone. In the UK, around one in four people live with a disability, and many rely on assistive technology to access the digital world, from screen readers and magnifiers to braille displays and speech recognition.

If software doesn’t work with these tools, people can’t access it. And too often, those barriers appear without developers even realising.

That’s why accessibility needs to be built into technology from the start. In this blog, I’ll share how my experience has shaped the way I think about building Beam’s AI products.

My perspective as a blind engineer

I wasn’t always blind. I was born with glaucoma, a condition that gradually damages vision.

I never had sight in my right eye, but I had enough vision in my left eye to manage most things growing up. 

At school, reading from the whiteboard was difficult and handwriting was slow and frustrating. Fortunately, I received SEND (special educational needs and disabilities) support, and a teaching assistant helped by reading textbooks aloud, enlarging materials, and copying notes from the board.

At university, I used an iPad to follow lectures and a student took notes for me. 

But when you enter the world of work, that level of support often disappears. I was lucky that my first employer was understanding. They set me up with a desk in a shaded corner and large monitors that made it easier to work.

As my vision continued to deteriorate, reading from screens became harder and bright light could make it difficult to focus. I started relying more on assistive tools like screen magnifiers, dark themes, and contrast settings to make text readable.

These tools let me move through code, documentation, and interfaces in a completely different way.

In recent years, AI has been transformative. Tasks that once felt daunting, like finding a specific detail buried in a huge document, are suddenly manageable.

In many ways, technology has kept pace with my sight loss. Where my eyes have struggled, tech has picked up the slack.

That’s why I care deeply about building technology that is accessible to everyone. 

How Beam Notes helps social workers with accessibility needs

While I spend my days writing code, frontline practitioners spend theirs supporting people. But alongside those conversations, they spend a lot of time documenting interactions and completing reports.

Taking notes during a conversation isn’t easy at the best of times, but it becomes even harder for practitioners with accessibility challenges such as dyslexia, visual impairments, or neurodivergence. 

Beam Notes turns in-person or remote conversations and documents into structured, accurate case notes, allowing practitioners to focus on the person in front of them.

This removes the pressure of writing in real time or spending hours on reports afterwards, creating a more level playing field for practitioners with different needs.

But if we’re building technology to improve access, we also need to make sure the software itself is inclusive.

Beyond the checklist: Our philosophy 

At Beam, accessibility isn’t a feature we “bolt on” at the end; it’s the foundation of how we build. While many teams treat the Web Content Accessibility Guidelines (WCAG) as a final exam to pass, we use them as our north star from day one.

These guidelines focus on making software:

  • Perceivable: Can everyone see or hear the information?
  • Operable: Can it be used without a mouse?
  • Understandable: Does the flow make sense to everyone?
  • Robust: Does it play nice with other assistive tools?
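To make the first two of those principles concrete, here’s a small sketch using TypeScript and plain DOM APIs. The saveNotes handler and the button copy are hypothetical; the point is the gap between a clickable div and a real button:

    // Hypothetical save handler, purely for illustration.
    const saveNotes = (): void => {
      console.log('Notes saved');
    };

    // Not operable: a clickable <div> can't be reached with Tab, doesn't
    // respond to Enter or Space, and is announced as plain text.
    const badButton = document.createElement('div');
    badButton.textContent = 'Save';
    badButton.onclick = saveNotes;

    // Operable and robust: a native <button> is keyboard-focusable,
    // activates on Enter and Space, and is announced as a button by
    // screen readers, with no extra work.
    const goodButton = document.createElement('button');
    goodButton.textContent = 'Save notes';
    goodButton.addEventListener('click', saveNotes);
    document.body.append(goodButton);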

But meeting these standards isn’t just an “engineering problem”. It involves everyone, from designers and product managers to leadership. Every decision, like the shade of a button or the structure of a menu, dictates whether a user feels included or ignored.

We don’t rely on assumptions, either. We work closely with the people using the software, because real users highlight the invisible barriers that a developer sitting behind a desk might never see.

This isn’t a one-off effort. As software evolves, new features can unintentionally introduce friction, and even small code changes can create blockers.

We use automated testing aligned with WCAG to scan for common issues like:

  • Poor colour contrast
  • Missing alternative text for images
  • Incorrect heading structure

These automated checks help us catch problems early and maintain a strong technical foundation, but they’re only part of the solution.
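To give a flavour, here’s a minimal sketch of what one of these checks can look like, using axe-core’s Playwright integration; the URL and test name are placeholders rather than our actual setup:

    import { test, expect } from '@playwright/test';
    import AxeBuilder from '@axe-core/playwright';

    test('page has no WCAG A/AA violations', async ({ page }) => {
      await page.goto('https://app.example.com'); // placeholder URL

      // Scan the rendered page against rules tagged WCAG 2.x A and AA,
      // which covers colour contrast, alt text, heading structure and more.
      const results = await new AxeBuilder({ page })
        .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'])
        .analyze();

      expect(results.violations).toEqual([]);
    });

Running a check like this in continuous integration means a contrast or alt-text regression fails the build before it ever reaches a user.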

Why automated testing isn’t enough

One of the biggest challenges is the invisible layer of the web, where people navigate software using tools like screen readers.

Screen reader users interact with a page very differently from sighted users. Instead of scanning visually or clicking around with a mouse, they move through content element by element using keyboard commands.

The screen reader announces what’s on the page, reading text aloud, describing images, and explaining the structure of the interface. It also needs to be told when something changes. 

For example, when a dialogue box opens, sighted users immediately see the new window appear. But a screen reader user needs the software to announce that something has changed and move the focus into the new dialogue. If that doesn’t happen, they may not realise the feature exists.
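As an illustration, here’s a minimal sketch of that handover using the native HTML dialogue element; the element id is hypothetical:

    // Assumes markup like: <dialog id="settings-dialog"> ... </dialog>
    const dialog = document.querySelector<HTMLDialogElement>('#settings-dialog');

    function openSettings(): void {
      // showModal() moves keyboard focus into the dialogue and makes the
      // rest of the page inert, so a screen reader announces the new context.
      dialog?.showModal();
    }

    function closeSettings(): void {
      // close() hands focus back to the element that opened the dialogue,
      // so a screen reader user isn't left stranded.
      dialog?.close();
    }

Custom dialogues built from divs have to reproduce all of this behaviour by hand, which is exactly where the silent failures creep in.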

In practice, this can block someone from accessing key parts of the product. Menus, forms, and buttons can all become unusable.

This is the kind of issue automated testing often can’t detect. Testing tools only see the underlying code; they don’t experience the product the way a screen reader user does.

So what’s the solution?

As the people building these systems, we need to experience that hidden layer of the web ourselves.

One of the most effective things developers can do is try using a screen reader. It can feel unfamiliar at first, navigating entirely by keyboard and listening to the interface rather than seeing it. But it quickly reveals issues that would otherwise be invisible. 

Once you start testing this way, you can move through the product just as someone using assistive technology would, and spot where things break or become confusing. 
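That habit can also be baked into automated checks. As a sketch, a keyboard-only test in Playwright might look like this; the tab count and selector are hypothetical:

    import { test, expect } from '@playwright/test';

    test('primary action works with the keyboard alone', async ({ page }) => {
      await page.goto('https://app.example.com'); // placeholder URL

      // Move focus with Tab, as a keyboard or screen reader user would.
      await page.keyboard.press('Tab');
      await page.keyboard.press('Tab');

      // Check that focus lands on the save button in a sensible order...
      await expect(page.locator('#save-note')).toBeFocused();

      // ...and that Enter activates it, without ever touching the mouse.
      await page.keyboard.press('Enter');
    });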

Because the truth is, we can technically meet all the WCAG requirements and still deliver a poor experience.

If there’s one thing I’d encourage other engineers to do, it’s to get hands-on and try it yourself. 

Walk in my shoes.

Building technology that works for everyone 

Accessibility isn’t a checklist; it’s a commitment not to leave anyone behind.

At Beam, we’re not just building technology to be ‘compliant’; we’re building it so that every frontline worker, regardless of how they see or interact with a screen, can focus on what matters most: the person sitting across from them.

Explore how Beam’s AI solutions can help make frontline work easier and more accessible.

Author:
Alex Stephany, CEO of Beam
