Google's New Pixel 2 Phones Are Ugly. It Doesn't Matter.

“Oh my God, Dad. The Pixel 2 is ugly!”

I’m told the fruit does not fall far from the tree. My 15-year-old son is a dyed-in-the-wool technology geek and, in certain categories, his knowledge (and sense of techno-style) far surpasses mine, even after my 25-plus years as a tech journalist. When it comes to smartphones, he’s like an encyclopedia. He can rattle off the specifications of comparable offerings and justify his top choices based on all the technical tradeoffs and costs. A chip off the old block, as they say.

On the way to school, this morning’s conversation was dominated by the announcement of Google’s new Pixel 2 phones. Their capabilities -- the actual utility they offer, much of which is driven by Google's cloud (and the artificial intelligence in it) -- make me want a Pixel 2. But as that announcement unfolded yesterday, the phone's industrial design (see this part of the announcement on YouTube and the images below) looked rather uninspired. A few newfangled features: Google Assistant springs to life when you squeeze the phone, and it’s waterproof (a feature I could have used quite a few times). Otherwise, not exactly a sight for sore eyes.

Google's new Pixel 2 phones are ugly

But Google was saving the best for last. As the announcement was wrapping up and journalists waited for a few more breadcrumbs of information, Google dropped the hammer with one of the best-kept technology secrets in recent memory: Pixel Buds.

These are not your ordinary earbuds. As this part of the announcement on YouTube shows, they can translate the spoken word on the fly. You speak to me in Japanese and, in my Pixel Buds, I will hear English (and vice versa, provided you’re equally equipped). As that technology makes its way into the real world, it will undoubtedly reveal some imperfections, and we’ll see the funniest ones on YouTube. But, as the world continues to globalize, Pixel Buds are without question one of the most significant advances in consumer (and business) technology history.

However, the 15-year-old was unfazed. “That’s cool, Dad. But that phone is still ugly.”

Maybe so. But in the even bigger picture, Pixel Buds are emblematic of something else to keep in mind: a mindset at Google that is markedly different from that of its competitors. It’s the antithesis of everything Apple has taught us about how much style matters when it comes to our technology.

We are now heading into an era when the phone can be as ugly as sin, and it just plain won't matter. 

Why? Because we’re moving to a future where the majority of the experiences driven by the technology currently found inside our phones will be experienced through something other than the phone. In other words, the phone, or what’s eventually left of it, will be reduced to a wireless communications and compute hub. Once it's a hub that never leaves your pocket, purse or backpack, who cares what it looks like?

We won’t need the keyboard or other buttons to tap because it will all be done with voice and other gestures (e.g., the nod of a head or the blink of an eye). We will experience the visuals, including augmented and virtual reality, through glasses. Or we'll "cast" the visuals to some other device like a high-def TV (as with Google's Chromecast). If GoPro cameras, with their voice activation and remote viewfinder capability, prove anything, it’s that we won’t need to take our phones out to take a picture. And of course, any audio (what you say or what you hear) will happen through something like Pixel Buds, as they already demonstrate.

APIs will live at the core of all these interactions. The phone, if you want to call it that, will handle wireless routing between your peripherals and the Internet and will, of course, do a ton of the compute-heavy work on behalf of those peripherals. That includes acting as an API gateway for the applications that need it. The model already exists.

Own a Fitbit? It comes with an API, and that API actually lives in the cloud, not on the Fitbit itself. Yes, the specific endpoint for your Fitbit is in the cloud. How does data about the steps you’re taking and the sleep you’re getting move between your Fitbit and that cloud-based endpoint? It’s routed through a communications hub: your phone.
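To make the point concrete, here is a minimal Python sketch of what talking to that cloud-side endpoint looks like. It uses the publicly documented daily-steps resource from Fitbit's Web API; the access token is a placeholder (a real call requires an OAuth2 token issued by Fitbit), so the sketch only builds the request rather than sending it.

```python
# Sketch: the Fitbit API endpoint lives in Fitbit's cloud, not on the
# tracker. The phone merely relays the tracker's data to that endpoint
# over Bluetooth and the Internet -- it acts as the communications hub.
import urllib.request

ACCESS_TOKEN = "YOUR_OAUTH2_TOKEN"  # placeholder; obtain via Fitbit's OAuth2 flow

def build_steps_request(date="today", period="1d"):
    """Build (but don't send) a request for the cloud-hosted daily-steps resource."""
    url = (
        "https://api.fitbit.com/1/user/-/activities/steps"
        f"/date/{date}/{period}.json"
    )
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}
    )

req = build_steps_request()
print(req.full_url)  # note the host: api.fitbit.com, not the device
```

Nothing in that URL points at the wearable itself; every read and write goes through the cloud, with the phone in the middle doing the routing.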

My friend Fred Davis, an investor, pointed out that Google is making Apple look lame. His post was actually the inspiration for this column.

One reason Android is positioned for the win is its Microsoft Windows-like business model, whereby anyone can make an interesting hardware play as the phone transitions into a communication and compute hub. In short, the Android ecosystem is open to all kinds of such hubs targeting all sorts of vertical applications. A hint of this world was brought to my attention earlier this week when I interviewed TRX Systems CEO and president Carol Politi.

TRX Systems offers the real-world version of the 3D indoor mapping visualization that makes it possible to track the fictitious special agent Jack Bauer as he crawls through an air conditioning duct in the TV series 24 (receiving guidance for each and every move from the folks back at counter-terrorism headquarters, who are also watching him). TRX's customers are first-responder organizations like police and fire departments whose commanders need to track their "assets" in real time as they enter dangerous and potentially lethal situations like last week's mass shooting from the Mandalay Bay Hotel & Casino in Las Vegas.

It's very cool stuff. But, as Politi points out, it only runs on Android. Why? Because first responders need something a bit more rugged than a beautifully styled iPhone. Unfortunately, the iPhone ecosystem isn't open to third parties like Sonim that want to build those ruggedized phones. According to Politi, TRX also works with other Android-based solutions that, to me, hint at the smartphone's pending deconstruction.

"We deliver our system in two configurations. One is on an Android device. Typically in the public safety community, that would be a kind of hardened Android device, like [ones] delivered by Sonim, for example," said Politi. "And the other is on an Android device, plus an accessory device, a wearable accessory. That wearable accessory is body mounted. It might be embedded in your vest and it's going to provide higher accuracy when you go inside of an environment that you've never been in before, because we're modeling human motion at the core, so if it's mounted very close to your body and not moving, we're going to give that first responder more accuracy."

If you're reading the tea leaves as I do, it's only a matter of time before companies start delivering turnkey solutions to industries like first response. The firefighter's coat and helmet, for example, will just come pre-equipped with all the ruggedized cameras, sensors, visors (providing the visuals) and earbuds. And what's left of the smartphone -- the hub -- is all the first responder will have to add to tie it together. Giving new definition to an old acronym, the PAN (Personal Area Network), the less vertical, consumer version of that will be hanging on the clothes rack at your local department store.

It may take a few years for this future to arrive, which is why an ugly phone right now seems like a really bad idea. But Google couldn't care less. It has always played the long game. Remember Google Glass? Everyone viewed that project as a complete failure. Hardly. The lessons from Google Glass will be showing up in future offerings for decades to come. The company knows that, at some point, function will win over form. So that's where it's placing its bets.
