The Open Interconnect Consortium, a group of big tech companies including Intel, Samsung, and Dell, announced last week its intention to create and open-source a specification for connecting the billions of things that make up the Internet of Things. That specification is the consortium’s sole reason for existing, and it’s starting with smart homes, smart offices, and smart cars.

Their mission is to define “the specification, certification, and branding to deliver reliable interoperability” between all wireless-enabled devices across all operating systems: Windows, Linux, Android, iOS, and so on.

In other words, if your smart watch and your smart television can seamlessly talk to one another via this platform, they’ll be certified and branded as such. Concepts like user identity, authentication, proximity, onboarding and provisioning, and of course communication, would be plug-and-play between any two certified devices.
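To picture what that could look like in practice, here’s a minimal sketch, assuming a hypothetical certified-device API. None of these class or method names come from the actual spec, which hasn’t been published yet; they just illustrate the concepts listed above.

```python
# Purely illustrative: a toy model of "plug-and-play" interop between
# two certified devices. Every name here is invented, not from the spec.

import uuid


class CertifiedDevice:
    """A device that implements the (hypothetical) common spec."""

    def __init__(self, name):
        self.name = name
        self.device_id = uuid.uuid4()   # device/user identity
        self.peers = {}                 # onboarded, provisioned peers

    def authenticate(self, peer):
        # A real spec would define a cryptographic handshake; here we
        # only check that the peer speaks the same spec at all.
        return isinstance(peer, CertifiedDevice)

    def onboard(self, peer):
        # Onboarding/provisioning: after authentication, each side
        # remembers the other, so future messages need no manual setup.
        if self.authenticate(peer):
            self.peers[peer.device_id] = peer
            peer.peers[self.device_id] = self
            return True
        return False

    def send(self, peer_id, payload):
        # Communication: any two onboarded devices exchange messages
        # without caring what OS or hardware is on the other end.
        self.peers[peer_id].receive(self, payload)

    def receive(self, sender, payload):
        print(f"{self.name} got {payload!r} from {sender.name}")


watch = CertifiedDevice("smart watch")
tv = CertifiedDevice("smart television")

if watch.onboard(tv):
    watch.send(tv.device_id, {"event": "workout_complete", "minutes": 30})
```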

As you can imagine, this announcement went out without a ton of fanfare. Compared to the March headlines touting the Los Angeles Times bot that broke the story of an earthquake, this was a blip.

I can see why. This isn’t layperson news. This is rocket science, or rather, robot science. And unless you’re talking about the Terminator or RoboCop or some other machine putting a human out of a job (and/or eventually killing them), people tend to want to get back to enjoying their slow news day.

But this story has much bigger implications for automated content than the template-driven ramblings of an earthquake sensor.

Over my last four years at Automated Insights, we’ve evolved automated content from its Mail Merge origins into a fully algorithmic content-churning platform, capable of cranking out professional-sounding, insight-packed articles at a rate of up to 1,600 per second.
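To make that Mail Merge-to-algorithms jump concrete, here’s a toy sketch. The data, thresholds, and word choices are all invented for illustration; this is not our actual engine, just the shape of the idea.

```python
# Mail-merge era vs. algorithmic era, in miniature. The mail-merge
# version fills blanks in one rigid template; the algorithmic version
# inspects the data, derives an insight, and varies the prose.

import random

game = {"home": "Duke", "away": "UNC", "home_score": 84, "away_score": 66}

# Mail merge: one template, blanks filled in, same sentence every time.
mail_merge = "{home} beat {away}, {home_score}-{away_score}.".format(**game)

# Algorithmic: compute something about the data, then choose phrasing
# that matches what the data actually says.
margin = game["home_score"] - game["away_score"]
if margin >= 15:
    verb = random.choice(["routed", "blew out", "cruised past"])
elif margin >= 5:
    verb = random.choice(["beat", "topped"])
else:
    verb = random.choice(["edged", "survived"])

algorithmic = (
    f"{game['home']} {verb} {game['away']}, "
    f"{game['home_score']}-{game['away_score']}."
)

print(mail_merge)   # Duke beat UNC, 84-66.
print(algorithmic)  # e.g., "Duke routed UNC, 84-66."
```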

We’re doing this for the AP and Yahoo, among others, and studies have shown that our automated content is not only indistinguishable from human-authored content, but in most cases it’s viewed as more trustworthy. So I can safely say that the technology to translate the binary code of machines into plain English is already mature and robust.

If the Robot Writer is already fully baked, the Robot Reporter is on the horizon, represented by all of these devices and sensors tracking and measuring things they previously could not. These sensors are becoming ubiquitous, not just in sports and finance and traffic and personal fitness, but in nearly every walk of life.

The standards that are being proposed by the Open Interconnect Consortium could be perceived as a stepping stone to the Robot Language. As all these devices chat with each other, they’ll do so with a common communication paradigm and an inherent understanding of one another. This will explode the new science of automated content by creating countless new verified sources.

But while the goal of a universal language might be noble, it’s also somewhat flawed.

From the Old Testament story of the Tower of Babel, to the century-old experiment with Esperanto (today remembered mostly for the William Shatner horror vehicle Incubus), to the still-not-fully-adopted metric system (although, to be fair, it’s really just America, and we’re not budging on this), the quest for a common language has always been seen as a path toward efficiency and, thus, innovation.

But that quest has always been sandbagged by cautionary tales and a lackluster adoption rate.

Standards, on the other hand, are a much more accepted way of creating and, more importantly, sustaining the uniformity necessary to get many hands working toward the same outcome.

In technology, standards are crucial. Philosophically, computer languages are really just translation engines from human to machine, English to binary. And once you get enough software and hardware components working together with at least some compatibility, there’s a lot less reinventing of the wheel, which frees up time for innovation.

These days we have development and compatibility standards for PCs, mobile phones, operating systems, and various applications and application types (e.g., web browsers). Yet even those standards are splintered among various players (Apple vs. Microsoft) and even across devices (iPhone vs. iPad).

Game developers, for example, have struggled with the lowest common denominator for decades now. If they want to develop a game that works across several platforms, they’re handcuffed to the abilities of the least capable platform.

But it’s important to note that few, if any, of those standards are universal. It should also be noted that there is another Internet of Things standardization effort, the AllSeen Alliance, which includes companies like Qualcomm, Microsoft, and LG, and is basically trying to achieve the same (but different) goal. Anyone who remembers HD DVD vs. Blu-ray, or the People’s Front of Judea vs. the Judean People’s Front, has an idea of how this might turn out.

And it probably isn’t good for the people, Judean or otherwise. Furthermore, Google and Apple are pretty much still out there on their own. At this point, a standards initiative probably seems like a net negative to them.

Look, I applaud the effort to standardize the Internet of Things, and if it gets done, it will be huge not just for automated content, but for anyone who can benefit from a smart anything. I just hope we’ll remember the lessons learned from standardization efforts past, and make sure we’re doing this for the data, not for the branding.