Rural Internet: Starlink Outage Data

In my last post, I talked about how frequent, short outages prevent video calling from being comfortable on Starlink. If you were curious about exactly how short and how frequent I meant, this post is for you.

Starlink’s satellite dish exposes statistics that it keeps about its connection. The small “ping success” graphs I shared in the last post are visualizations provided by the Starlink app, which are driven by these statistics.

Thanks to starlink-grpc-tools, assembled by sparky8512 and neurocis on GitHub, I have instructions and some scripts to extract and decode these statistics myself. I haven’t been great at collecting the data regularly, but I have six bundles of second-by-second stats, each covering 8-12 hours. (February 1 saw a couple of reboots, so its segments are approximately 7.5 and 11 hours, instead of 12 like the other segments.)

The raw data exposes a per-second percentage of ping success. It’s somewhat common for a single ping’s reply to go missing. Several pings are sent each second, though, and one missing every once in a while is mostly no big deal. The script I’m using tallies the number of times *all* of the pings within a given second went missing (percent lost = 100, or “d=1” in the data’s lingo). It also tracks “runs” of seconds where all of the pings in contiguous seconds went missing.
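If you’re curious how such a tally works, here is a minimal sketch in Python. It assumes you have already exported the per-second drop fractions to a JSON list (the file name and exact format here are my own illustration, not the tool’s actual output):

```python
import json
from collections import Counter

# Hypothetical input file: a JSON list of per-second ping drop fractions,
# where 1.0 means every ping sent that second went unanswered ("d=1").
with open("ping_drop_samples.json") as f:
    drop_rate = json.load(f)

run_counts = Counter()  # run length in seconds -> number of runs observed
run = 0
for d in drop_rate:
    if d >= 1.0:
        run += 1              # still inside an all-pings-lost run
    elif run:
        run_counts[run] += 1  # the run just ended; tally its length
        run = 0
if run:
    run_counts[run] += 1      # the data ended mid-run

for length in sorted(run_counts):
    print(f"{length:3d}s outages: {run_counts[length]}")
```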

Figure 1: Count of each length of outage.

These first two graphs (Figure 1) explain what I mean by “frequent” and “short”. This histogram displays one bar per “run length” of all-pings-lost seconds. That is, the left-most bar tracks when all pings were lost for only one second, the next bar to the right tracks when all pings were lost for two consecutive seconds, the third bar tracks three consecutive seconds, and so on. The height of each bar represents the number of times an outage of that length was observed. The histogram is stacked, so that the outages on the morning of February 1 (green) begin where the outages on January 31 (blue) end.
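For anyone who wants to reproduce the stacked layout, a sketch along these lines works with matplotlib (the per-day counts below are placeholders, not my real data):

```python
import matplotlib.pyplot as plt

# Placeholder run-length counts per day: {run length in seconds: count}.
days = {
    "Jan 31": {1: 120, 2: 60, 3: 40},
    "Feb 1 (am)": {1: 110, 2: 55, 3: 35},
}

lengths = sorted({l for counts in days.values() for l in counts})
bottoms = [0] * len(lengths)
for label, counts in days.items():
    heights = [counts.get(l, 0) for l in lengths]
    # Each day's bars start where the previous day's bars ended.
    plt.bar(lengths, heights, bottom=bottoms, label=label)
    bottoms = [b + h for b, h in zip(bottoms, heights)]

plt.xlabel("outage length (seconds)")
plt.ylabel("count")
plt.legend()
plt.show()
```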

Over the 66.5 hours for which I have data, we counted 739 one-second outages. That’s an average of just over eleven per hour, or slightly more often than one every 6 minutes. The decay of this data is pretty nice: two-second outages are approximately half as likely (344, averaging just over 5/hr, or just under one every 12 minutes), three-second outages just a bit less likely than that, and so on. By the time we get to 8 seconds, we’re looking at only one per hour.

If we look at all 1s-8s outages, i.e. those that on average happen once per hour or more, we have a total of 2018. That’s an average of just over 30 disconnects per hour, or one every two minutes. For once, the data proves the subjective experience correct. On a video call, it feels like you get something between a hiccup and a “the last thing I heard you say was…” every couple of minutes.
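The arithmetic behind those rates is simple enough to check in a few lines:

```python
hours = 66.5  # total observation time across all six bundles

for label, count in [("1s", 739), ("2s", 344), ("1s-8s total", 2018)]:
    per_hour = count / hours
    minutes_between = 60 / per_hour
    print(f"{label}: {per_hour:.1f}/hr, one every {minutes_between:.1f} min")

# 1s: 11.1/hr, one every 5.4 min
# 2s: 5.2/hr, one every 11.6 min
# 1s-8s total: 30.3/hr, one every 2.0 min
```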

The right-hand graph is laid out in the same way, but its bars count minute-long outages. You can just barely see a few counted as 1 minute and 2 minutes in length. Last Thursday, February 4 (red), was the first time we had a significant Starlink outage, long enough for me to spend time poking around trying to figure out whether it was “just us or everyone.”

I’ve been mostly concerned with frequency – how often I can expect outages of each severity. The tool I’ve used to extract the statistics exposes the outages differently: it is instead concerned with the total amount of downtime observed.

Figure 2: Cumulative downtime, grouped by outage length.

These graphs (Figure 2) present the data as the extraction tool provides it. Each bar represents outages of a certain length, as before, but now the height of the bar represents the total number of seconds of downtime they caused. The 1-second and 2-second bars are now about the same height: there were about half as many 2-second outages as 1-second outages, but each lasted twice as long, so the total amount of downtime they caused is about the same.
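Converting from the frequency view to this downtime view is just a multiplication: each bar’s height becomes its count times its run length. Using the two counts quoted above:

```python
# Run-length counts from the frequency graphs (only the 1s and 2s bars shown).
run_counts = {1: 739, 2: 344}

# Total downtime contributed by each outage length: count * seconds.
downtime = {length: count * length for length, count in run_counts.items()}
print(downtime)  # {1: 739, 2: 688} -- nearly equal bar heights
```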

That giant red bar that has appeared in the right-hand graph is eye-catching: thirty-seven and a half minutes of downtime, caused by one 37-minute outage. The 1-minute outage stack is quite a bit taller too, accounting for ten minutes of total downtime itself. This is how the significant outage on Thursday appeared to us. There was a large chunk of time where we obviously had no connection to the internet (37 minutes), surrounded by quite a bit of time where we’d start getting something to download, but then it would stop (ten 1- and 2-minute outages).

The sum of all 1-second-or-longer downtime we experienced in these 66.5 hours of data is 14,686 seconds, or just over 4 hours. That’s roughly 94% uptime.
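That uptime figure comes straight from the totals:

```python
downtime_s = 14686
observed_s = 66.5 * 3600           # 239,400 seconds of data
uptime = 1 - downtime_s / observed_s
print(f"{uptime:.1%}")             # 93.9%, roughly 94% uptime
```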

Figure 3: Limiting the vertical axis to a count of 50 reveals low-count outage lengths.

We didn’t see the 37-minute outage in the earlier frequency graphs because it has only happened once. If we zoom in on those graphs (Figure 3), so that most of the 1-13s bars run way off the chart, we can see a few more one-time-only outages. Each day has had some small hiccup in the “long tail” above twenty seconds. I see hope in the fact that the grey color (the most recent data, from the day after the long outage) is nearly absent from the longer run-length counts.

I’m curious about the sharp decline between 13 and 14 seconds. Is that a sweet spot for some fault recovery in Starlink’s system, or is it just an aberration in my data? I’ll have to keep collecting to see if it persists.

I’ve posted the summary data I used to generate these graphs in a gist on GitHub.

My Favorite Moment of 2013

It’s the last day of 2013, and I’m supposed to be finishing preparations for a cross-country move. But instead, I really want to recount my favorite moment of this past year.

On Friday, October 11, 2013, MIT’s Hobby Shop held a celebration to commemorate its 75th anniversary. The Hobby Shop is a place for the MIT community (students, faculty, alumni, and such) to … well, practice *manus* after stretching their *mens*. It’s a large room filled with benches, power tools, and hand tools for working wood, metal, plastic, etc.

People use the Hobby Shop to build … things. Equipment for lab projects, musical instruments, furniture, signs, or whatever else they might dream up. I was (sadly) not a member in college, but joined later to learn and use their large machinery when I started building my bed.

The celebration in October included many member projects on display, one of which was a camera. Biyeun, its builder and user, gave a presentation about making and using her creation. In her introduction, she explained her discovery of view cameras and her instantaneous reaction: “I must build that.”

As I nodded my head in understanding of her sentiment, I saw heads all around the room do likewise. Building a machine gives you a different understanding of it than any amount of use ever will. Just a taste of such knowledge can cause everyday objects to practically scream at you forever afterward, “Imagine what it’s like to create me.” I knew that everyone nodding had heard that call.

The dean of student life, Chris Colombo, spoke as well. He was not a member of the Hobby Shop, but had good friends there. He expressed awe at projects like Biyeun’s camera that he had seen leave the shop, and a few minutes into his speech said something like, “I wish I knew how to build something like that.” As he took a breath afterward, I could just feel every shop member in the room struggle to restrain themselves from walking onto the stage, grabbing Chris by the elbow, and dragging him to the shop, to teach him how. “C’mon, I’ll show you,” were the words on every lip.

Realizing that I was surrounded by people who not only had wanted to know, and then spent time doing and learning, but now also wanted to show and teach, was my favorite moment in 2013. Finding people who are curious is not terribly hard. Finding those who will follow through on their curiosity can sometimes seem rare. But finding one who actually wants to share what he or she has learned, by answering the endless naive questions of a beginner, is like winning the lottery. To be standing in a room full of such individuals was overwhelming.

Hobbies -= 1

I shut down a hobby today. BeerRiot, the site I started over six years ago, is now closed. I’m keeping the domain active, because I’ve used the name in other places, but browsers will see only a static archive of what used to be there.

BeerRiot began as an experiment. I wanted to learn about Erlang, and I needed a project to drive my curiosity. It worked, and I learned a good deal about modern web application development in the process. In fact, I learned enough about both that, through blogging about my progress, I was able to join up with a smart team and work in Erlang on web apps professionally.

Even after the experiment paid off, BeerRiot remained my sandbox. New webservers, new storage techniques, new rendering processes, new API designs … I was able to practice with them all in a live setting before attempting to pull an entire team of engineers toward any of them.

So why would I give up my playground? Simply put: I don’t play there any more. My interests have moved on, and it’s time to remove the mental clutter of the service existing (no matter its reliability). Were the virtual server some physical object, I’d be selling it at a garage sale. As it is not, I will instead throw a tarball on a backup disk, and laugh when I find it in a few years.

What’s next? On the code side, more focus on that smart team and professional Erlang work I mentioned. On the hobby side … definitely not another web app. I’ll keep this blog up. No promises on changes to its post frequency, but readers will be among the first to know when I find a new thing.

Cheers.

Dev House Boston

If you’re in the Boston area, and interested in Erlang/ErlyWeb, and free next Sunday … I’ll probably be hanging around Dev House Boston.

It’s my first trip to one of these hackathons. I’ve never been to Foo/BarCamp, or any of the others. So, we’ll see how it goes.

My best idea for a project so far is an Emacs mode for ErlTL. But mainly I’d be interested in helping people come up to speed with Erlang/ErlyWeb and/or Facebook app development. I think ErlyWeb’s a great platform for web development, and I’d like to see more people put it through its paces.

I’m also familiar with plenty of other languages/systems, so I feel pretty confident that I’ll be able to hack on whatever comes up.