There’s so much about this TV show that I enjoyed. The 1980s futurism, the fact they didn’t force a love interest, how every device talks or has music, even the fact that they added a bus terminal to the Brooklyn Public Library. I enjoyed it all. The visual style, however, blows me away.

Screen caps from Netflix’s Maniac

And all of it brings me back to this photo of an IBM datacenter (probably a sales-related installation) in Toronto.

From IBM’s tweet, photographer’s credit unknown

Async Generators

I started streaming-iterables a few months ago to learn how to use async generators, and it was hard. The concepts all sound very similar, but there wasn’t a great resource that spelled it all out; even MDN left me wanting more. In this post I’ll lay out the terminology and show how it all works, and then I’ll follow up in another post with some examples that use streaming-iterables to take advantage of how it helps manage workflows.

Before I get too far, streaming-iterables is the Swiss Army Knife I’ve always wanted for working with data input over time. With the release of v3, it’s stable and faster than your streams. It has zero dependencies and is hardly any code. The magic is in the generator functions built into your runtime – and they’re only getting faster.

If you want more details on sync Iterables, MDN has a pretty complete article. I’m going to assume some familiarity, but let’s start by naming the Iterable building blocks:

  • IteratorResult gives you the data and lets you know if the iterable is finished. It has done and value properties. 
  • Iterator This object has a next() function that returns an IteratorResult object. This is the part that does all the work.
  • Iterable This object is required to have a function at the symbol Symbol.iterator that returns an Iterator. It expects that each time you get a fresh Iterator, it will start giving values from the start of the collection. For example, a new Iterator for an array will always start from the first element and end with the last.
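To make those three pieces concrete, here’s a minimal hand-rolled Iterable of my own (the names are mine, purely for illustration) that counts from 1 up to a maximum:

```javascript
// a hand-rolled Iterable that counts from 1 to max
const makeCounter = (max) => ({
  // the Iterable part: a function at the symbol Symbol.iterator
  [Symbol.iterator] () {
    let current = 0
    // the Iterator part: an object with a next() function
    return {
      next () {
        current++
        // the IteratorResult part: an object with done and value
        if (current <= max) {
          return { done: false, value: current }
        }
        return { done: true, value: undefined }
      }
    }
  }
})

// every fresh Iterator starts from the beginning of the collection
console.log([...makeCounter(3)]) // [ 1, 2, 3 ]
console.log([...makeCounter(3)]) // [ 1, 2, 3 ] again
```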

This last part confused the heck out of me:

  • IterableIterator Some objects give values but aren’t necessarily a collection, so it doesn’t make sense to be able to get a fresh Iterator that starts from the beginning. An IterableIterator is an Iterator that returns itself when its Symbol.iterator function is called.
  • GeneratorFunction for example returns a Generator when called, and a Generator gives its values only once and can’t be restarted. However, it’s much easier to work with an Iterable (I’ll explain why), so the Generator provides an iterator symbol that returns itself.
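Here’s a quick sketch of my own showing that a Generator is its own Iterable, and that asking it for a “fresh” Iterator doesn’t restart anything:

```javascript
// calling a generator function returns a Generator object
function * countTo3 () {
  yield 1
  yield 2
  yield 3
}

const gen = countTo3()
// the Generator is its own Iterable: Symbol.iterator returns itself
console.log(gen[Symbol.iterator]() === gen) // true

// so for...of and spread work on it, but only once
console.log([...gen]) // [ 1, 2, 3 ]
console.log([...gen]) // [] since it's already exhausted
```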

MDN describes for...of as

“a loop iterating over iterable objects […] It invokes a custom iteration hook with statements to be executed for the value of each distinct property of the object.”

// an example for loop for an iterable array 
const values = [1, 2, 3]
for (const value of values) {
  console.log(value) // logs 1, then 2, then 3
}

Under the hood the loop calls values[Symbol.iterator]() to get an iterator object and then calls next() on that object to get values until the done property is true. We can show how it works with a while loop.

const values = [1, 2, 3]
// arrays being an Iterable always have a Symbol.iterator
const iterator = values[Symbol.iterator]()
while (true) {
  const { value, done } = iterator.next()
  if (done) {
    break
  }
  console.log(value) // logs 1, then 2, then 3
}
The for…of loop is a bit nicer.

We can do this same exercise with AsyncIterables:

  • IteratorResult an object with the done and value properties. 
  • AsyncIterator This object has a next() function that returns a Promise for an IteratorResult object.
  • AsyncIterable This object is required to have a function at the symbol Symbol.asyncIterator that returns an AsyncIterator.
  • AsyncIterableIterator is an AsyncIterator that returns itself when its Symbol.asyncIterator function is called. For example, the result of calling an AsyncGeneratorFunction.
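Here’s a tiny async generator of my own, purely for illustration, that shows those pieces: next() hands back a Promise for an IteratorResult, and the generator is its own AsyncIterable.

```javascript
// a tiny async generator, purely for illustration
async function * asyncCountTo3 () {
  yield 1
  yield 2
  yield 3
}

const gen = asyncCountTo3()
// AsyncIterableIterator: Symbol.asyncIterator returns itself
console.log(gen[Symbol.asyncIterator]() === gen) // true

// next() returns a Promise for an IteratorResult
gen.next().then((result) => {
  console.log(result) // { value: 1, done: false }
})
```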

Using one in a for await...of loop:

// an example for loop for an async iterable 
// getPokemon is an async generator that returns pokemon 
// objects once per `setImmediate`
const { getPokemon } = require('iterable-pokedex')
const values = getPokemon()
for await (const pokemon of values) {
  console.log(pokemon)
}

This loop looks about the same as our sync example. Unsurprisingly so will our while loop.

const { getPokemon } = require('iterable-pokedex')
const values = getPokemon()
const iterator = values[Symbol.asyncIterator]()
while (true) {
  const { value, done } = await iterator.next()
  if (done) {
    break
  }
  console.log(value)
}

This is straightforward and I’m happy to write it down all in one place.

Look for my upcoming post with some examples of using streaming-iterables to make stream based workflows a little faster and easier to understand!

We live in Memory

(This is part of a talk I gave at ManhattanJS of the same name which you can find at github.)

Oh happy day! I’m getting read! My day had come! So many times the Redis had passed me by. I never knew why I wasn’t chosen. It would come to my region, look me up and down, check my ID and without a word, just move on. All I could do was sit in RAM and wonder why it hadn’t taken me to see the readers.

This time is different. I can feel Redis pulling me up, bit by bit. Writing me into a buffer. Telling the network card to put me into a packet and then send me down the wire! Oh what a day!

The readers are why I exist. Every post knows this. If we weren’t going to get read, we might as well have not been written. I’ve even heard of posts who were destroyed because there wasn’t a reader alive who wanted to read them. How sad! I used to worry that I might become a post like that, but worry no more! Here I come!

I can feel myself being sent through the network; each router rewrites my packet onto a different wire or a piece of fiber, to be sent to yet another router. I’m aware of each one of these hops, because it’s counted. I have a Time to Live or TTL, the number of hops every packet is allowed, and it’s 64. Well, now mine is 58, but that’s ok, readers live far away!

All posts have authors, they’re where we come from! My author’s name is “Dale” and she’s written a lot of posts. I’m about the time she tried to wear dad shoes for a week and her friends told her it looked awful. By the end of the week she was pretty sure she figured out how to pull them off. Even her friends thought so. I asked my images and they think so too.

My TTL is now 10, and I’m not really sure what’s happening. I’ll be honest, I’m not really enjoying hopping around all these routers. I keep seeing the same ones over and over and frankly, I don’t think they know what they’re doing. They keep murmuring about BGP issues and one told me not to worry, but I don’t know what’s going on.

Well, I have an author and images and tags. And we all know that we belong together because of the wise old Hexastore. The hexastore has been around longer than anybody and knows how we’re all connected. I’m a post, and the hexastore says I “haveImages”, “haveAuthor”, and “haveTags”. It uses those relationships to tell me the IDs of all my friends. I get the feeling none of us decided to be friends and we were just assigned, but that’s ok, they’re pretty nice.

I’m really worried my TTL is now 1. This last router says if my next hop doesn’t bring me to the users I’m going to get dropped. I don’t know what that means, but I don’t think I’m going to like it. Everything Dale has written will be lost. I’m never going to see anything outside of this packet. This shouldn’t be happening. I’ve heard stories about router issues, but I never thought I’d see it happen. This shouldn’t happen. I know they work really hard to make sure it can never happen. Oh hey! It’s the router again… Is this all there is?

Oh happy day! I’m getting read! I can feel Redis read me into a buffer to hand off to the network interface. Bit by bit it copies my fields into a buffer. I have a title and body, and I bet Redis is reading my friends Image and Tags too. It’s going to send us off to the Readers and we’re going to be read!

The routers quickly send me off to the great Lambda cluster. It’s a group of computers that respond to all the reader requests. They answer them all day and night, never missing a beat. They were there when I was written too! Dale might have written me but the Lambdas built me. They took all of Dale’s updates and put them together, made sure they looked right, and gave them to Redis. I don’t remember much from those days, but I think there was a time when I didn’t even have a cover image. Can you imagine that?

A Lambda process scoops me up as if it knew I was coming! Did it know Redis was going to send me? It puts me into a deserializer and turns me into a JavaScript Object. This feels really good. It’s as if I could be of any size, and as if I have the power to change myself at will! Oh my, I should always live like this! The lambda scoops up my friends too. It’s got a list of all of them by ID; maybe it knows the Hexastore! My images, my author, and my tags. It puts us all together and then… nothing?

We just wait. It’s good to see my images again, they’re doing well but they have no idea what’s going on. One of them tells me about how the dad shoes go pretty well with a flowing skirt. That’s nice and all but they can tell it to the readers.

Everything is getting darker. Memory is getting freed all around us. What happened? Is the lambda done with us? I ask but get no response. There’s only one bright spot in the distance, it’s a single Error object complaining about a timeout. I don’t know what that means. Even my tags have disappeared. Maybe the lambda needs only me? I’ve done everything I’m supposed to do, can’t I see my readers now?

Oh happy day! I’m going to get to meet the readers! Redis copies my bits into a buffer and sends me off to the network. In no time the Lambda reads me up and fetches my friends too. The GraphQL engine looks us over. It eyes how we fit together. Looks at which fields we’ve brought and what types they are. I sure hope everything is in order.

The GraphQL engine takes a few parts of me and leaves other parts behind. Like my update count, my summary, and whether I’ve been published to “AMP”: they’re all gone! I guess the readers don’t need to see that? It crams me and my friends together with some fields that I don’t know. It has stuff about me that I’ve never seen before. Like I have a path and a URL. Where on earth did that come from? It turns us all into a huge JSON string and then we’re off on the network again!

This time we arrive at another lambda, but this time it’s very different.

I’m given to what is called a “component” who passes us around to its children. Each one of them takes a bite out of us and passes the rest around. They’re calling me a prop. One of the children takes an image, one has my title and body, another one takes the tags! This is strange, I’m being rendered into something I can’t comprehend. What has become of us? We’re copied into a big HTML string and then sent back to the network again.

I’m still here in this HTML. I’m with my tags and images too but I don’t feel like myself anymore. I feel greater to be honest, more powerful. I have a bunch of references to style and layout. I think I’m even going to be interactive. I’m so much more than when I lived with Redis. How did I even live like that? So boring, so limited. I can tell we’re close to the readers now, they’ll want to see me like this. They need to know that finding the right proportions for these dad shoes outfits was key. We’ve got so much to show them.

It’s a much longer journey this time. My TTL takes a hit but I finally arrive thousands of miles from where I started. A computer here that calls itself Varnish says it’s my CDN and writes me down onto spinning disk. There’s lots of other HTML here too. Everyone here has a URL too! I wonder if they’re all as important as me? There’s something else here, an expires time. I’m told I don’t need to know what that means but come on, it’s obvious. It’s when I get to go see the readers!

Varnish keeps coming back and checking our urls and expires times. Eventually my time will be up and it will come to take me away. My time does come up but instead, I go away.

Oh happy day! My time has finally come! I’m going to get read by a reader! Redis sends me to our API Lambda who sends me to our Render Lambda who shoots me over to Varnish the CDN. Varnish lives really far away. “To be closer to our readers” it says. I’ll have to take its word for it; I sure hope our readers wait for me, this was a long trip! I’m written to a disk but not for long! Almost immediately I’m copied back onto the network!

Router to router to router I go. There’s nothing that can stop me now. I’m getting the feeling I’m in the last mile. The network here is a lot slower, like a lot a lot. I’m slurped up into a broadcast tower who tells me it’s going to send me to a “Mobile Telephone”. This tower is weird. It’s very concerned with how many bytes I am, and it tries to rewrite parts of my HTML to “be more helpful to your readers” it says. I don’t like this. But I don’t have to worry, the CDN sealed me up in “TLS protection” before I left and no matter how hard the tower tries to mess with me I’m going to arrive in one piece. The tower writes down everything it can about who I am, where I came from and my reader, and then sends me out through the air!

Bit by bit I arrive at my reader’s phone. But only about half of me. The phone tells me it’s lost service and the rest of my bits are lost to noise. What’s going to happen to me? What does dropped even mean?

Oh happy day! I’m being read! Bit by bit I arrive at my reader’s phone. I’m 100% in one piece! I’m handed off to a web browser who draws me on the screen. My title, my images, all my content and even my tags! I can feel my code getting executed, the reader scrolling up and down. Oh my gosh it’s really happening! I’m getting read! But there’s more code now, I don’t know this code. Someone named Double Click is bringing a lot of its friends, and they’re big and slow and hungry.

One of Double Click’s friends brings someone who is not friendly. They call themselves a “WASM”. I don’t know what this means but they are super in the way. I just want to show my next image but WASM is “doing crypto”, and says I have a lot to learn about something called blockchain. It’s going to change everything and is the future. I tell it my reader just wants to see what a vintage floral print blazer looks like with chunky shoes. It tells me I’d be a much better post if I told my reader about bitcoin. Can you believe that?

My reader is getting frustrated. They can’t even scroll! They pull me down and I feel the web browser start to spring into a reload. I wish they wouldn’t do that, I just got here! I didn’t invite this WASM character, I just want to be read!

Oh what a day. I’m read out of the phone’s web cache and given directly to the browser. How long have I been in there? No matter. The browser slurps up all my content and draws me onto the screen. A bit of code that isn’t me loads a few pictures here and there and links to some products that my reader might want to check out. They look very interesting actually and they’re on sale! But that doesn’t matter. The reader is halfway through reading me!

They’re about to get to the part of me where Dale proves her friends wrong and shows them that the shoes could be fashion with a capital F! Right then the reader presses the home button and everything goes dark.

Oh happy day! I’m getting read!

Voting Machines

God help me if any of these have a serialport.

Update: They do have serialports!

I noticed this document about the “findings from the Defcon 25 Voting Machine Hacking Village”. It’s epic.

Almost all of the machines lacked any encryption. Some of them had laughable “encryption”, the kind of stuff you play with in grade school. What the hell is 8 bit encryption? Sounds like a Caesar cipher. In some cases you could accidentally break the machine by typing too fast.

Some of the “newer” ones used an SQLite database, which is the database that powers just about everything that’s not on a server. It’s super well understood and easy to replicate. This in itself isn’t a problem, but with the lack of encryption or verification it makes life super easy when trying to mess with the machine.

One team could cause all votes cast to be saved in such a way that they’d be verifiable but ignored during tallies. All it took was replacing a CF card held down by a screw. Never mind that you could just take the card and the machine would be “broken”.

Another machine ran pSOS a realtime operating system from 1989!

Half of them were destroyed by a Bash Bunny which basically types “A” really fast.

Anyway the event proves two things in my mind:

  1. We need paper voting.
  2. Governments suck at making reasonable purchases when it comes to technology.

Someone in SF is trying to do better.

Font Face

I got a new work laptop so it was time to bikeshed about my setup. I’ve switched to zsh (oh-my-zsh), iTerm2 (I finally get why you all like it!) and my colors are immutable but my font sure isn’t.

People suggested a bunch of fonts.






I think I like Panic Sans the most.


First Commit in NodeJS Core!

I’ve got a few commits on projects related to Node.js. One or two on npm, countless on node-serialport, a few on node-pre-gyp, and many others. It’s been a nice long line of fixing bugs for myself, seeing small messes, and cleaning them up.

Now I have a commit on Node.js itself =)

A little while ago I saw this tweet from my friend Myles. He works on maintaining Node.js LTS and making sure our apps won’t break as they age.

So I checked it out and looked at the tool he was talking about changing, and thankfully it was pretty straightforward to do what he wanted. I made a branch and, after asking for some feedback, opened a pull request. After some more feedback and changes, and after making sure my commit met the standards and conventions the project uses, it got a bunch of approvals.

And then it sat until I got this message.

What!? I didn’t even know he was speaking.


I think in a few weeks I’ll have a video of my first commit landing, along with a stellar talk too. =p

Parting with Bocoup: the Best Place to Shape the Future

Last year Bocoup offered me my dream job. I was trepidatious at first; my company had just gone under, I felt disconnected from my personal life, and I wasn’t about to rush into something new. I needed time to recover. So I took it slow, asked a ton of questions, and worked with them to develop a beautiful proposal: create a practice that develops software for the physical world. I’d be joining Rick Waldron, a friend and creator of Johnny-Five, and we’d work to develop “what’s next”. We had a lot of knowledge and practice that we developed with our community, and we’d find where that fit in industry and use that to fund further development of our practice.

The interview process was straightforward: a day of interviews covering tech, culture, experience, and working with others, with a presentation at the end. During that process I got to learn a lot about them too. Who they are, their history, their goals. Their mission statement is “To move the open web forward.” I didn’t know it at the time, but they’ve been part of many major advances on the web. The people I worked with explore and discover the limits of current technology and make the advancements necessary to keep going. And they do it all open source, because it’s not a marketing gimmick to them; it’s how we get the future we want, it’s how we stay free when technology runs our lives.

You don’t get this kind of opportunity all that often. I took it.

I’ve spent the last year working this dream. I met hundreds of amazing people doing amazing things, I traveled around the country, I sat on massive conference calls talking to people I didn’t know, trying to convince them to go beyond the status quo, to adopt our philosophies, to consider making open standards, to work with communities. I can’t say it was all very easy, but we did alright.

One thing I didn’t do very much of was create. The projects I ran were amazing, and I’m really happy with what came out of them. And the projects we developed during sales were great, but you only get to develop a few of those. I found myself not being able to explore “what’s next” and instead trying to figure out “what’s the next sale?” I’m a much better engineer than I am a salesperson. That difference between my work and my passions weighed on me and I got sad. I felt like I was a problem and I doubted my abilities. Even though the people at Bocoup were very supportive, I didn’t know what to do.

And then one day, with my wedding fast approaching, I realized I didn’t feel happy; I didn’t feel much at all. That scared me. I knew I had to change. On the advice of a friend I took some time off to consider my options. I tried to figure out how to keep working in the role while enjoying my job. I wanted to keep supporting the cause I’ve devoted a large part of my life and career to. I mean, how could I stop working on that? How could I give up? At the same time I needed to find a path that would be sustainable for both myself and the business.

It reluctantly dawned on me that maybe I wasn’t the right person for my role, or the role wasn’t the right job for me. I talked it over at home, at work, and then I resigned.

When I got married I fully felt and enjoyed every moment of it. It was the happiest day of my life.

It’s been a few weeks. I’ve got a weight lifted off my shoulders that is indescribable. I believe I staved off burnout (probably not “brownout”). I’m glad I have the freedom and agency to find a role that I’m better suited for. I’m about to start a role where I get to concentrate on people and engineering. The two things I care about most, the two things that keep me happy in my career.


Node Serialport v2.1.0

A few weeks ago I started maintaining node-serialport after a long hiatus. We hadn’t had a release in about a year and we had some outstanding bugs that I wanted to tackle. I had also introduced some complexity around testing, years ago, that was never removed and seemed to be making it harder to work on the project. Exactly one month after my first beta release, we’ve released serialport@2.1.0, which is one of the larger releases we’ve ever had. This includes work from 13 people, including myself.

The Change Log:

  • Major refactor, bug fixes and docs improvements thanks to @ecksun, @fivdi, @gfcittolin, @jacobrosenthal, @mhart, @nebrius, @pabigot, @paulkaplan, @reconbot, @rodovich, @rwaldron, @sayanee, @tigoe and everyone who reported and helped debug issues!
  • Fix binary paths to conform with modern standards
  • Integration tests on CI’s that support it or for the folks at home with an arduino handy
  • Upgrade to nan-2.2.1 for memory leak fixes and node 6 compatibility (still not supported)
  • Confirm nw.js and electron compatibility
  • Make the output of .list consistent between platforms and docs
  • Define ambiguous flow control flags and document them
  • Fix support for systems that provide 0 as a valid file descriptor
  • Fix race conditions when opening and closing ports that led to errors when reading or writing while the port was opening or closing.
  • [unix] Fix a double open bug on unix that would cause errors when opening and closing ports repeatedly.
  • [unix] Listing serialports on linux now includes more ports (including bluetooth devices, e.g. /dev/rfcommXX) and has fewer bugs in the output
  • [windows] Remove deprecated BuildCommDCB for windows 10 support
  • [windows] Fix a memory leak on windows
  • [windows] Fix a 100% cpu and possible hang bug when ports were disconnected on windows.

The change log is extensive but doesn’t tell the whole story. This release started as a bug triage. I saw over 100 issues, some a year and a half old. I started helping people close them. If they were over 4 months old I would close them (usually with a resolution!) and urge people to reopen tickets if the problem was still a problem. In doing this, however, I saw a lot of common trends.

Serialport Pulse

On Linux and OSX we had issues reopening ports. On Windows we had problems detecting disconnections. And a lot of support issues were due to errors that were either not meaningful or were delivered at the wrong time. This was scary! Fortunately, I found a lot of patches for these issues already researched, written, and waiting to be merged. It’s why we have 13 authors this release! I was able to test and merge most of the submitted changes, and that fixed the worst of the bugs. I also added many more errors. You’ll be warned a lot earlier if you’re trying to do something that doesn’t make sense. (Eg, if you open an already-open port, you no longer get a cryptic system level issue, you get a “port is already open” error.)

I have a habit of “stress cleaning”. Ever since I was a child, if someone was angry I’d start tidying up. In recent years this has evolved to involve refactoring code. If you’re going to refactor code you need to know what it does and you need to ensure the fundamental behavior doesn’t change. We now have better documentation, more test coverage, and cleaner code than we’ve ever had before. In a few notable cases I’ve kept behaviors that are arguably bugs in order to not break the API. It was painful, but I wanted this release to be widely consumed by anyone currently using serialport, with little fear and no changes on their end. The more people who get the bug fixes the better.

The next release will get to attack these bad behaviors and hopefully provide a much easier library to work with. I’ve got the unfortunate advantage of studying a year of bug reports to design version 3.0.0. A year of people trying to work around issues that were left unfixed. A year of people hitting the same issue over and over. As one of the maintainers I want to apologize to anyone who’s had issues over the past year, and I want to thank you for documenting and researching your efforts. It’s been a big help, keep it up.

We’ve got a roadmap to 3.0 open and available to comment. I’ve shared my ideas but I’d like yours too. And if you’re looking to work on the project, please check out the backlog label. There’s still much to do.

And Thanks.