Hello, you’ve reached tech support

Reflections on using bespoke homemade technology for a live online show.

Last month we did our first online run of The Evidence Chamber, an interactive courtroom drama that each player joins remotely through their internet browser. To make this happen, we used our custom-built platform to carry the player through each part of the show: the trial’s 'evidence' documents, video chat among players, votes and polls, and a debrief with the show’s makers. It might not sound like anything groundbreaking, but it was the hardest thing I’ve ever done.

In face-to-face shows of times gone by, we used the same platform to run the same show - but things were very different. Each player used one of our iPads, which were all the same, and had been obsessively tested beforehand. When the iPad did something unexpected, the player just had to look around the room a bit confused, and one of us would rush over to fix the problem. Oh, those were the days.

With a remote online show, each player is joining from a different browser, laptop, webcam, and location - as well as bringing vastly varying tech skills. Playing with 12 jurors plus me, Rachel and Dan running the show means you need a tech system that enables the near-seamless cooperation of 15 different devices in 15 different locations operated by 15 different people. (Not counting when our forensic scientist collaborators sometimes join for the debrief.)

Because the show is dependent on this delicate web of technology, a lot of the most important building I did happened after the test shows, and even early in the run of the public shows. During that time, I found myself hunched over the keyboard muttering a few recurring thoughts to myself – mantras if we’re being generous. I’ve noted them down here, in the hope they’re applicable to anyone else experimenting with new ways to make remote participatory online experiences.

Another delicate web

Building a video API is really hard

Let’s get this specific one out of the way first, since I actually realised this quite early in the process. In fact, building a video API is so hard that we didn’t do it. With the amount of time we had and the complexity of the task, it just wasn’t feasible. It would’ve been a huge task to provide compatibility across all browsers and computers, including video fallback (where the system loads lower-quality webcam streams if a user's internet connection is poor). Instead we paid for a video API service, which was kind of a bummer. As we’ve mentioned before, we’ve been measuring and reducing the ecological impact of our shows, but this third-party video API is a black box; there’s no data or analysis on how much energy it uses. Sure, we can estimate (each show is about the same as a 12-person Zoom meeting) but it’s not as satisfying as working with the raw data. And aside from the ecological constraints, incorporating a third-party tool meant that whenever we saw video-related tweaks that could be made within the show to improve the player's experience, they were either very hard or impossible to implement. Very frustrating.

But in other ways, using an existing video API was great. Compared to anything I could have cobbled together in the last few months, the off-the-shelf API used much less processing power at the user’s end, meaning that many more people could play because we didn’t have to put in restrictions based on how powerful your computer is. And it came with quality assurance across various browsers up to 8 years old, which would have taken me ages to achieve myself. The alternative - ensuring each player has the latest version of a few select browsers that work with a DIY API - doesn’t sound that appealing. So in this case, accessibility won over sustainability and artistic capacity. I’m still not completely comfortable with this, but as Rachel wrote in her reflections on the show, we’re learning not to be perfectionists. Don’t let the perfect get in the way of the good, or whatever.

Me in March 2020, wondering how I'll ever build this damn video API

Expect tech issues

With this set-up, there are so many variables in each show that it’s not really useful to start troubleshooting before the live show begins. We realised we had to build the system for when things go wrong, not if things go wrong, which was a tough shift for us mentally (as Rachel wrote). In my case, it was brutal because we could only find out what went wrong once players used their tech to join the show. All of the fixing has to happen semi-publicly, when of course we’d prefer to have a shiny thing all ready to go the moment people arrive.

But that just wasn’t possible for a number of reasons. For example, internet connections are still pretty unreliable. Although we pointed prospective ticket buyers to internet speed testing services (which also aren’t that straightforward), one tenth of all players didn’t have enough upload speed to join the video chat in the show. Other players disappeared momentarily when their internet dropped out, so we thought carefully about how you could seamlessly re-enter the show. We’d played other online experiences where you had to re-enter login details, or start back at the home screen on re-entry, which is incredibly frustrating when you’re missing out on the action because of dodgy WiFi. Also, because I told the system to expect a full jury box of 12 players at any time, I had to stop it from freaking out when it couldn’t find all 12 jurors at every moment (I actually used a manual bypass for this).
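For anyone building something similar, the seamless re-entry idea boils down to keeping a join token on the player’s device, so a dropped connection puts them straight back in their seat rather than at a login screen. Here’s a minimal sketch of that pattern - the storage key and the '/rejoin' endpoint are placeholders made up for illustration, not our actual platform:

```typescript
// Minimal sketch of seamless re-entry (illustrative endpoint and storage key).
const TOKEN_KEY = 'player-token';

function restoreShowState(state: unknown): void {
  // In a real platform this would re-render whatever scene the jury is on.
  console.log('Rejoined with state:', state);
}

async function joinShow(): Promise<void> {
  // If this device already has a token, the server can drop the player
  // straight back into their seat instead of treating them as a newcomer.
  const existingToken = localStorage.getItem(TOKEN_KEY);

  const response = await fetch('/rejoin', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ token: existingToken }),
  });

  const { token, state } = await response.json();
  localStorage.setItem(TOKEN_KEY, token);
  restoreShowState(state);
}

// Re-run the join flow automatically whenever the connection comes back.
window.addEventListener('online', () => void joinShow());
```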

The Asymptote of Perfection, by Matthew Smith

Improve your visibility over the whole system

With a remote show, I have limited visibility over whether anything is a user error or a system error, but I realised after some test runs that there was some scope to improve this. For example, I built a tool to pick up common-but-unhelpful “rules” set up on the user’s device. In a lot of cases, people many moons ago will have set up their internet browser to deny webcam access by default, which is useful if you don’t want to be spied on by the FBI, but isn’t useful if you’re trying to join our video chat. So their video wasn’t working, but their computer was silently doing what it had been told to, and the user hadn’t done anything “wrong” - all of which made it hard to pick up an “error” by human-to-human communication alone. But my little tool could sniff out this “rule” in seconds, and then I could talk to the player about how to change their browser settings.
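For the technically curious, that check boils down to something like the sketch below - not our production code, just the general idea: ask the browser’s Permissions API about the camera where it’s supported, and fall back to a test getUserMedia call where it isn’t.

```typescript
// Hedged sketch of detecting a "deny webcam by default" rule before the show.
async function checkCameraAccess(): Promise<'granted' | 'blocked' | 'prompt' | 'unknown'> {
  try {
    // The Permissions API answers without prompting the user, but 'camera'
    // isn't queryable in every browser, hence the try/catch.
    const status = await navigator.permissions.query({ name: 'camera' as PermissionName });
    if (status.state === 'denied') return 'blocked';
    if (status.state === 'granted') return 'granted';
    return 'prompt';
  } catch {
    // Fall back to actually requesting the camera and reading the error.
    try {
      const stream = await navigator.mediaDevices.getUserMedia({ video: true });
      stream.getTracks().forEach((track) => track.stop()); // release the camera again
      return 'granted';
    } catch (err) {
      // NotAllowedError usually means the user (or a saved rule) blocked access.
      return err instanceof DOMException && err.name === 'NotAllowedError'
        ? 'blocked'
        : 'unknown';
    }
  }
}
```

If this comes back as 'blocked', you know it’s a settings conversation rather than a bug hunt.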

I also built a tool which would send me full error details and crash-logs whenever the system crumbled at the user's end. This was really rare but it helped me understand weird browser and operating system quirks early on, so by the fourth public show I’d re-written the system robustly enough to get these instances down to zero.
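A tool like that can be surprisingly small: a global error handler that posts whatever it catches back to a server. Here’s a rough sketch of the idea - the '/crash-report' endpoint is a stand-in, not our real API:

```typescript
// Sketch of a client-side crash reporter (illustrative '/crash-report' endpoint).
interface CrashReport {
  message: string;
  stack?: string;
  userAgent: string;
  url: string;
  timestamp: string;
}

function reportCrash(report: CrashReport): void {
  const body = JSON.stringify(report);
  // sendBeacon keeps working even while the page is falling over or unloading.
  if (!navigator.sendBeacon('/crash-report', body)) {
    // Fallback if the beacon is refused (e.g. payload too large).
    void fetch('/crash-report', { method: 'POST', body, keepalive: true });
  }
}

// Uncaught exceptions anywhere in the show's code.
window.addEventListener('error', (event) => {
  reportCrash({
    message: event.message,
    stack: event.error?.stack,
    userAgent: navigator.userAgent,
    url: location.href,
    timestamp: new Date().toISOString(),
  });
});

// Promise rejections nobody handled (a common source of silent failures).
window.addEventListener('unhandledrejection', (event) => {
  reportCrash({
    message: String(event.reason),
    stack: event.reason?.stack,
    userAgent: navigator.userAgent,
    url: location.href,
    timestamp: new Date().toISOString(),
  });
});
```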

In general, with enhanced visibility I could be more proactive with helping people during the show.  A lot of people don’t want to cause a fuss when they have an issue, so they try to carry on regardless and then get a compromised experience because of something that takes thirty seconds to fix.

My visibility the day before the first test show

Tech problems can have human solutions

In another Fast Familiar project, we’ve been creating a (very fictional) show about an airline that claims to passengers that their whole operation runs on the latest artificial intelligence technology, but it’s actually one poor flight steward frantically sending voice notes to his teammates sat behind a huge control panel back at HQ. Sometimes I think the idea came from our own shows.

In anticipation of tech issues like those I mentioned above, we built some buffer time into the beginning of the show so that I could make sure each person was set up correctly behind the scenes. At first, this was a screen on our platform with text explaining that we were just waiting for other players to join. During test runs, we realised that people didn’t really trust this screen – they usually suspected the tech was broken for them and they were “stuck” there, in purgatory forevermore while everyone else had a great time in the courtroom. We could have added something that felt more “live” – an automated counter showing how many players were ready to play, or a spinning wheel, for example.  

But actually, in place of that whole screen, we nominated Dan to become Stan the Court Clerk (complete with costume – fresh haircut and ironed shirt), who welcomed people live via video camera. He could troubleshoot some of the more basic issues by chatting to players, and with the visibility tool mentioned above I could see who needed additional help and ask Dan/Stan to send those players to my private room, where I could walk them through the steps. We could have served players with an interactive troubleshooting flowchart, or a video tutorial, but I honestly think the way we did it was more efficient and much more welcoming.

Spinning wheel: not very reassuring.

Try to make audiences feel comfortable

Once Dan/Stan had sent players to me, I had to actually figure out what was wrong and how they could fix it. I needed to be calm and patient while I did a lot of thinking on my feet, a lot of quick googling, a little stalling for time, and a lot of techy language translation. We really didn’t want to alienate people as soon as they'd joined the show, so I had to learn to pitch the conversation at the right level for each player. I definitely improved at this as the shows went on (I think).

We also found that during the show, even when nothing was actually going wrong (which was most of the time), people still needed a lot of reassurance that nothing was going wrong. We’re wary of unfamiliar tech (and familiar tech for that matter). It’s pretty justified – think of all the hiccups and delays that happen during a single Zoom meeting. But usually the tech won’t break itself; it will be broken by people freaking out and trying to fix it or get around it in unexpected ways. When people feel on edge with the tech, it becomes a distraction from the actual content of the show, which is the last thing we want.

The original king of pro-active tech support.

Try to make yourself feel comfortable

So we spent a lot of time making audiences comfortable with the tech that’s fuelling the show, but what about us as the people running the show? After the initial flurry of problem-solving, I’d hover over the mouse with my nose pressed to the screen for the entire show, just waiting for the next blip. It made me tense and anxious, which was pretty exhausting even when there were no actual disasters. Equally, Rachel and Dan relied on me to translate (and fix) any tech problems, which could leave them feeling frustrated and helpless when an issue arose. It felt really unsustainable, and made the idea of doing another run in the future quite unappealing. To make the experience less of a white-knuckle ride for us, I built several small tools that could reassure us about what each player was experiencing. (Sadly, not actual mind-readers, although I’m working on it.)

For example, I built something to tell me the live time-stamp of each player’s video playback during the show. If someone’s taking a suspicious amount of time to complete the video-watching section, I can check whether they’ve just re-wound for another viewing, or whether they’ve been “stuck” at the same timestamp for a while (in which case something might have gone wrong and I can contact them). I also built a tool that can see whether someone has cast their electronic vote at one of the poll moments in the show. Players can’t leave that section until everyone’s voted, so I can contact a seeming 'abstainee' directly, find out what’s up, and (hopefully) keep things moving.
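Under the hood, the playback tool is essentially each player’s browser reporting its `video.currentTime` every few seconds so I can watch it on a dashboard; the vote checker follows the same report-and-display pattern. A sketch of the reporting half - the endpoint and payload shape are illustrative, not our actual platform:

```typescript
// Sketch of reporting a player's playback position (illustrative endpoint).
function reportPlayback(video: HTMLVideoElement, playerId: string): void {
  setInterval(() => {
    void fetch('/playback-status', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        playerId,
        currentTime: video.currentTime, // where in the evidence video they are
        paused: video.paused,
        reportedAt: new Date().toISOString(),
      }),
    });
  }, 5000); // every five seconds is plenty for human monitoring
}
```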

I also made a tool that could show whether each player's documents and videos had loaded properly at any moment during the show. During group discussions, we found that people would sometimes say a video had skipped or a file hadn’t loaded, when the conversation made it clear that they’d just missed a key bit of evidence. At first our hearts sank when we heard this, but once we had these little tools in place we could see that they’d chosen to cover up a human error with a tech issue – which is just fascinating now that it no longer makes us panic.
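The load-status tool works on the same principle: listen for each asset’s load and error events and push the results to an operator view. A simplified sketch rather than the show’s actual code, with a made-up '/asset-status' endpoint:

```typescript
// Sketch of tracking whether each piece of evidence loaded (made-up endpoint).
type AssetStatus = 'loaded' | 'failed' | 'pending';

const assetStatus = new Map<string, AssetStatus>();

function trackAsset(id: string, element: HTMLVideoElement | HTMLImageElement): void {
  assetStatus.set(id, 'pending');

  const report = (status: AssetStatus) => {
    assetStatus.set(id, status);
    // Push the whole map to the operator dashboard each time something changes.
    void fetch('/asset-status', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(Object.fromEntries(assetStatus)),
    });
  };

  element.addEventListener('error', () => report('failed'), { once: true });
  if (element instanceof HTMLVideoElement) {
    element.addEventListener('canplaythrough', () => report('loaded'), { once: true });
  } else {
    element.addEventListener('load', () => report('loaded'), { once: true });
  }
}
```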

Make your tech support feel like this. Minus the screen glare.

I think so much of this comes down to: how do we make everyone at least as comfortable and welcome as they would be in a physical venue? OK, they’re by no means perfect spaces, and I think online shows offer a lot of opportunities and benefits that I haven’t seen manifested in bricks and mortar. So maybe I’ll narrow the question: how do we not lose the feeling of being surrounded by people?  

For me, a lot of the answers relate to ‘The Arts’ being able to build our own tech tools (or rather, our own environments).  In a physical building, artists and venue staff make a million decisions to engineer a social space, but digital realms can leave us feeling much more constrained. Of course, building your own virtual venue is much easier said than done – it takes a lot of time and skill-sets that aren’t built into most arts orgs, and aren’t accounted for in a lot of funding opportunities.  But that’s a whole other discussion...