It’s been a couple of weeks since we recorded our 400th show and on MobileTechRoundup show #401 we picked a couple of winners from our giveaway celebration.
We wanted to thank the great folks at HTC for offering up an HTC U11. The winner, Arthur, selected the new Solar Red HTC U11.
Adam won the black LG G6 and it will be on its way tomorrow.
Running time: 64 minutes
Listen here (MP3, 73MB)
You might not believe Apple’s claim that the iPad Pro can replace a PC, but the newest models are bumping up against Apple’s own top-specced MacBook Pro in performance benchmark tests.
The CPU and GPU tests were run by Mac-focused benchmarking blog Bare Feats, which pitted the new 12.9-inch iPad Pro against the highest configuration 2017 MacBook Pro 13-inch, with Intel’s Kaby Lake 3.5GHz i7 processor, Iris Plus Graphics 650 GPU, 16GB RAM, and 1TB flash storage.
To give a fuller picture of the iPad Pro’s generational improvements relative to the MacBook Pro, it also tested the 2015 and 2016 iPad Pro models, and the 2016 MacBook Pro retina 13-inch.
Specs for each device in the tests included:
On the single-core GeekBench 4 CPU test, the 2017 MacBook Pro scores 4,650, just ahead of the 2016 MacBook Pro. Both outperform the 2017 iPad Pro models, but not drastically: the 12.9-inch and 10.5-inch models score just under 4,000, well ahead of the 2015 and 2016 models, which score around the 3,000 mark.
The difference between the older iPad Pro models and the newer ones is more pronounced in the multi-core CPU test. The 2017 MacBook Pro leads with a score of 10,261, not far ahead of the 2017 iPad Pros, which both score above 9,200 and edge out the 2016 MacBook Pro’s score of 8,500.
In the GPU tests, the new iPad Pro leads the pack. Both 2017 iPad Pro models come out just ahead of the 2016 and 2017 MacBook Pros in the Metal GPU compute test, with scores nearly double those of the 2015 and 2016 iPad Pro models.
None of these results will convince anyone that a locked-down iPad without mouse support can replace a PC. But it is interesting, as Bare Feats notes, that you can get an iPad Pro with some laptop capabilities and comparable performance at a third of the cost of the best MacBook Pro.
Apple demos iOS 11 on the new iPad Pro
It’s early days, but it seems that iOS 11 has a big problem when it comes to usability and discoverability of new features.
The WWDC 2017 keynote gave us the first proper look at the upcoming iOS 11, and it’s clear that Apple has been hard at work adding a whole raft of new features to the platform to help keep up with Android.
This one slide from the WWDC keynote shows just how much new stuff is going to land with iOS 11.
But anyone who has followed operating systems for any length of time knows that adding more features brings problems with it. Specifically, three problems arise:
I’m constantly amazed at how many people I come across who haven’t discovered that they even have features such as Live Photos or 3D Touch, or the feature that lets you send drawings in Messages (but only if you turn the iPhone to landscape orientation), and yet these are fast becoming core features of the platform. I mean, people have paid good money for a piece of kit, and are only using a fraction of the features it has to offer.
I’ve seen this many times before, especially with monolithic products such as Microsoft Windows or Office, and down the line it tends to lead people to think that there’s no value in upgrading.
I’ve also noticed how Apple is increasingly having to rely on iconography that isn’t all that clear, and only seems to turn to words when the designer has clearly given up (notice how out of place “Screen Mirroring” looks on the iOS 11 Control Center, and the crazy amount of space that it takes up compared to everything else, especially given how niche the feature actually is).
Then you have things like this in the Messages app (see below). While I know what these buttons do (from pressing them), the icons are as clear as mud.
Even the redesigned App Drawer in iOS 11 is still a mystery meat stew of icons and buttons.
Another good example I’ve touched on in the past is the Settings app.
Oh boy, is that a steaming mess.
Apple has bolted more and more stuff onto the Settings app over the years, to the point that it has grown to resemble the Control Panel in Windows. It’s a horrible old carpet beneath which Apple brushes countless design sins. And it feels like Apple’s “solution” to the problem of finding what you want in the Settings app has been the same as Microsoft’s — throw in a search feature, add another layer on top of the mess, and hope no one notices (Control Center in iOS 10 is much the same “solution” as the Settings app is in Windows 10 — a place to put some of the most commonly used features, floating above the bilge of all the legacy the platform has built up).
I think that in part Apple is deliberately trading discoverability for simplicity. Hiding a feature behind a finger swipe or a 3D Touch press means there’s less on-screen clutter to contend with, and if people don’t know a feature is there, well, at least it’s not getting in their way. But the flip side is that people overlook the feature, or find it clumsy and cumbersome to use.
Now, I’ve no doubt that the tech press will work hard to raise awareness of all the new features in iOS 11, but there are plenty of people out there who have no interest in the technology they own beyond using it, and will never come across this information.
It feels that increasingly Apple is letting those people down.
It’s almost as if Apple has lost sight of its users, believing that everyone is either a tech nut, fanboy, or developer.
Now as a rule I wouldn’t be commenting on an operating system that’s as early in the development cycle as iOS 11 is. Developer previews are meant to get the platform out to coders in a timely fashion so they can start using the new features, and not something designed for public consumption. But I’m making an exception here for two reasons.
First, this isn’t a new problem, but instead one that iOS has been increasingly suffering from as Apple adds more features to satisfy consumer demand and stay ahead of Android. I’ve watched this problem creep through iOS over the past few years.
iOS 10 was bad. iOS 11 is looking to be worse.
And secondly, by the time it hits the developer preview stage, iOS is already well along in the development cycle. The public beta is expected next month, and the final release in September. If Apple were planning a huge redesign in iOS 11, we’d be seeing evidence of that in the developer preview.
And we’re not seeing a sign of that.
There’s still time for Apple to unveil a refreshed, revamped, and improved user interface for iOS 11, but with only about ten weeks before the release date, I doubt it.
Bottom line, we’re all to blame for this mess. On the one hand we want things to be simple and easy to use, but on the other we want every conceivable feature crammed into every device. Part of the problem is that the iPhone and iPad, with their vastly differing screen sizes, share what is essentially the same (albeit radically forked) operating system.
What works on one screen size doesn’t on another.
I remember the same thing happening with Windows when netbooks became a thing. An operating system that worked well on a desktop or notebook looked cramped on the small screen, with cursor tolerances becoming problematic and user interfaces shrinking to the point of being unusable.
I suppose we’ll have to wait for iOS 12 to see if Apple has any ideas as to how to dig itself out of the usability hole it’s dug for itself.
Kevin and I are pleased that both LG and HTC volunteered to sponsor a big giveaway to help us celebrate another milestone with MobileTechRoundup show #400.
We wanted to thank the great folks at HTC and LG for offering up a brilliant HTC U11 and a solid big-screen LG G6 to two lucky listeners. Check out the show notes on MobileTechRoundup and listen to the podcast to find out how to enter over the next two weeks. You can also check out my thoughts on these two devices in my HTC U11 review and LG G6 full review.
Running time: 64 minutes
Listen here (MP3, 70MB)
Yesterday’s WWDC keynote made me nostalgic for the Steve Jobs days, with his laid back style and calming ‘Reality Distortion Field.’
Keynotes of the past were tranquil. Peaceful. Even Zen. The excitement was limited to a new product or feature, and the event was carefully paced so that everyone had time to absorb the information and take a breath before moving on.
We were teased with new products and given more than we expected, but not so much as to feel overwhelmed or overcome.
Pricing was discussed, but in a low-key, discreet, and tasteful way.
But not anymore.
Yesterday’s keynote was less Zen and more a Marvel or Star Wars movie. There were spaceships and laser blasts and TIE fighters screaming across the stage, volcanoes and lava, and even Darth Vader made an appearance.
And the pacing was breakneck, reminiscent of a Michael Bay Transformers movie, with everything coming at you at a furious, non-stop pace. There wasn’t a moment to catch your breath and take in what you’d seen a second ago before a new presenter was flinging more stuff at you.
I think the only time I relaxed during the entire keynote was during the five seconds it took me to down a Red Bull.
But it wasn’t just the style that was different; the content was also noticeably altered.
It used to be that Apple didn’t unveil products during the WWDC keynote. WWDC is a developer conference; as such, Apple has focused on things that are of interest to those building apps for its various platforms. Sure, there are plenty of examples where the presentation veered into the weeds, but these were the exception, not the norm.
But not anymore.
Now the keynote is the place to hit developers over the head with new products. Product update after product update was unveiled, along with several new products, even products that had no direct relationship with developers (such as the HomePod). Now Apple can save on the expense and effort of having launch events for products by cramming them into the WWDC keynote.
I wonder how much of this is driven by the fact that there’s really not enough that’s new and exciting about Apple’s products to demand a launch event. Even new products such as the iMac Pro and HomePod were done to death after about 10 minutes.
It was crass and tasteless and so unlike past keynotes that I’m led to the conclusion that Apple is so worried about hardware sales that it transformed the keynote into a cringeworthy sales event.
If this is the reality of an Apple event from now on, it makes me long for the ‘Reality Distortion Field’ that Jobs wielded so expertly. He had an effortless onstage charisma that went beyond special effects, scripted jokes, and frantic pacing.
He sold you stuff without you realizing it. Now, you can smell the desperation, almost as if everyone on stage is on commission.
I miss you, Steve.
The primary purpose of any developer conference in the technology industry is to give the host a forum to lay out a clear roadmap for developers, so that they better understand how to build their applications to take advantage of that company’s platforms.
If you are a developer, you want to see tools. You want to see fresh APIs, and you want to see fresh hardware that your applications can really take advantage of.
You want to see what is relevant to you that helps you make money with your software or value-added products. Not necessarily what is shiny.
However, because these are public-facing companies with stocks tracked on Wall Street, developer conferences have pivoted toward being more of a dog-and-pony show, especially the keynotes and the self-praise these companies shower upon themselves for reaching sales and shipment milestones.
Apple is king of the developer conference keynote. Sure, Google and Microsoft are also experts at them and are students of the same school. But Apple perfected this.
The hyperbole thrown around at Apple keynotes is so ridiculous and effusive that I even created a drinking game to go along with them.
This year’s WWDC keynote was no exception. However, I kind of got the feeling that this particular installment was Apple’s turn at “Please clap” for product announcements. It was a lukewarm reception at best.
We saw modest improvements to watchOS. We saw an updated macOS. We saw long-awaited augmented reality APIs for iOS that are a few years behind what Microsoft and Google are doing. We got a glimpse of iOS 11, which has some nice but minor UX improvements.
The hardware, as usual, took center stage. We were shown HomePod, which is an expensive Siri-connected speaker that competes with Amazon Echo and Google Home at twice the price.
We got a lukewarm refresh of the iPad Pro, a bunch of spec-upgraded iMacs, and a wallet-busting, workstation-class iMac Pro, which starts at $5,000 and competes with high-end content creation and engineering systems from HP, Lenovo, and Dell.
In my estimation, the Mac refreshes received a disproportionate amount of time, considering that — as a product line — the iMac only accounts for about 9 to 11 percent of Apple’s revenue, and its overall share of the content creation market, which is what the super-high-end iMac Pro is addressing, has dwindled to almost nothing.
Nobody is using Macs for high-end engineering, 3D modeling, or medical imaging. Those workloads have for the most part gone the way of Microsoft Windows, and to a lesser but more focused extent, Linux. I know this because my brother works in that industry and has seen that market shift in person. And I know what his peers in Hollywood are up to as well.
The iMac Pro — and the yet-to-be-seen, forthcoming Mac Pro refresh — is addressing a market that just plain does not exist. These are symbolic machines designed to run software workloads that do not really run on that platform anymore. This is a machine designed for rich fanboys.
It is the personal computer workstation equivalent of a Bugatti Veyron. It’s meant to look pretty and show off to people that you have a lot of money. There will undoubtedly be some takers. It may even sell out in extremely limited quantities.
But like a million-dollar hypercar, you’ll never unleash the full potential of the thing on the highway, because you sit in traffic constantly and everyone else is doing 70mph. To get that car really cooking, you need to be a sheikh with your own racing track and, ideally, your own fuel refinery.
Sure, there are still plenty of Macs left in publishing, but you don’t need a terabyte of RAM and 12 or 18 Xeon processor cores to do magazine layouts or even high-end paint work. The $1,500 to $2,000 iMacs are fine for that. So are the MacBooks. Of course, those workloads are almost entirely dominated by Windows now, too.
Do you know what high-end content creation and engineering types really want? Touchscreens. Styluses. Stuff that runs on powerful PC convertible devices like the Surface Pro 4, the Surface Book, and Surface Laptop, so they can actually draw on the thing.
You would think that would be the perfect market for Apple. This is a vertical segment just waiting to be exploited. But there are no touchscreen- or digitizer-capable Macs. Only iPads. macOS High Sierra, for all the attention it got at WWDC, has no touchscreen APIs.
But the iPad is woefully unequipped to run real vertical-market apps. It doesn’t have enough CPU horsepower or memory, and it isn’t hardened for field use.
It would be theoretically possible to make up for this deficiency by having the cloud do the heavy lifting. That technology is real and Apple has the money and expertise to make something like that work.
But Apple’s cloud is not business or vertical-market worthy, and as a company, it has done an awful job in partnering with companies whose clouds actually are.
I was rather disappointed in the iPad lineup shown at this year’s WWDC. I was hoping we would see the iPad really go “Pro,” with larger memory configurations and much more powerful CPUs and GPUs. Apple’s iOS 11 is absolutely up to the task now, but the iPad hardware is not.
Yes, the higher-refresh-rate screen technology is nice from a consumer standpoint, but what professionals really want is 4K or higher tablet screens. You need a lot of video memory and a lot of CPU and GPU horsepower to drive that in real time. It’s not easy to do with embedded systems, particularly given ARM-based architectures and bus bandwidth, but it is possible, especially if you leverage the cloud.
Apple’s marketing-speak aside, the current crop of iPad Pros is really just a small iterative improvement over the late-2015 models. I considered giving the new 12.9-inch a pass this time around, as I was already fairly happy with my 2015 first-generation unit given the apps I actually use. I ordered the new one only because I write about this stuff, and Amazon offered me $400 in credit if I traded in my old one.
If I didn’t actually write about this stuff, I definitely would have let it go. So, yes, I bought the new one, but under protest.
The artists and content-creation people I know in the entertainment and publishing industries were hoping for something from Apple with a larger, higher-resolution screen, more powerful processing, and touch capability.
But many of them have already moved on to Surface devices, including the Surface Studio — which, in many ways, is the Mac Pro or the iPad Pro that Apple should really have built.
If they do use iPads now, it’s for light work, and it isn’t their main device. Apple has an iPad Pro that really isn’t built for Pros and can’t run Pro workloads.
And that is indeed very sad. But, hey, please clap.
Are you excited about the new iPad Pro or are you giving it a pass? Talk Back and Let Me Know.
Apple on Monday announced a laundry list of new products and software upgrades, including the next version of its mobile OS, iOS 11.
Various executives took to the stage at WWDC 2017 and sped through announcements, highlighting key features iPhone and iPad users will gain later this year when iOS 11 is released.
Naturally, Apple didn’t have enough time to go through every single feature coming to iOS 11. Thankfully, toward the end of the iOS 11 portion of the keynote, Apple displayed a slide that detailed more features.
There’s a lot on the slide, but a few of the features are worth pointing out:
Arguably, the biggest feature Apple didn’t discuss is giving developers and users access to the NFC technology in iPhones.
Being able to tap the back of the iPhone against an accessory to establish a Bluetooth connection, for example, is one benefit of NFC that Android users have long enjoyed. With iOS 11, it appears Apple is finally allowing that to happen.
There’s a lot more in iOS 11, of course, than what’s on this slide. But until we get our hands on the public beta, which is expected later this month, we won’t know the true extent of the new features.
Hours before the WWDC 2017 kickoff, a file manager app for iOS 11 briefly appeared in Apple’s App Store.
There’s not an awful lot of information, but the app is undeniably an official Apple app.
A file manager for iOS has been a much-requested feature, especially among pro users, so let’s hope this isn’t a fake-out or just a rebranding of the iCloud app or something similar.
All eyes will be on Apple as the week-long WWDC 2017 kicks off with the keynote speech. And while Apple fans will be salivating over iOS and macOS updates, along with whatever shiny new hardware is unveiled, it’s Siri that will make or break the event.
This year’s WWDC will be an interesting time for Apple. For well over a decade the keynote has been a time when Apple could boast to the world — and more importantly, other tech companies — about how far ahead of the game it was, and how they had little hope of catching up.
But this time is different.
Apple is under pressure from multiple fronts. iPhone sales are teetering on the brink of a downturn, while iPad sales have crashed through the floor. And the company is lagging behind Microsoft when it comes to notebook innovation, trailing behind Amazon when it comes to getting hardware into the living room, and dawdling far behind Google and Facebook when it comes to AR and AI.
No matter what sort of face Tim Cook and the gang put on when they’re on-stage Monday, Apple is on the back foot, and it knows it.
But there’s one Apple product that could change all that — Siri.
It’s easy to look at Apple and think of it as a hardware company. After all, when most people think “Apple,” they think about the iPhone, Macs, iPads, and such. But the hardware is only part of the equation. In fact, building hardware isn’t hard — just look at how crowded the Android market is with devices ranging from treasures to total trash. Anyone with money can go to an ODM — Original Design Manufacturer — and get something drawn up and made.
What really sets Apple hardware apart is the massive software ecosystem at the back end.
Everything from iCloud to the App Store and the wider ecosystem, it’s all about the software.
And part of that software ecosystem is Siri.
Apple has been playing it slow and careful with Siri. It acquired the technology back in April 2010 and baked it into iOS 5 in 2011, and since then Siri has spread across Apple’s platforms to include macOS, watchOS, and tvOS.
But with Amazon and Google carrying out a full-on assault on the living room with speakers and smart devices, it’s time for Apple to push back. Because if it doesn’t, these players will slowly chip away at Apple’s currently dominant ecosystem. After all, someone who’s purchased a smart device from Google or Amazon is not only less likely to buy a similar Apple device, but is also going to be looking at what else Google and Amazon have to offer, further weakening Apple’s ecosystem.
Now some might say that Apple already owns the living room, kitchen, and bedroom, because it owns the smartphone, tablet, and smartwatch market, and people take these devices with them as they move about the house.
But this is a weak argument.
With the exception of Apple TV, Apple’s hardware is aimed at a single user, and that makes it unsuited to home use. If your home automation setup is based around Apple, then you better make sure that everyone has iPhones and iPads, because it’s not like you can pick up someone else’s device to start controlling things. And this is where devices such as the Amazon Echo or Google Home really come into play. They’re devices that are aimed at the family, not the individual, and that means they’re far more convenient to use in a communal setting.
I’ve noticed it myself. While I used to have to turn to my iPhone to control the lights or some other bit of IoT kit, it’s far quicker and easier to bark a command at a nearby Echo or Dot than it is to whip out my iPhone and activate Siri.
And remember to factor in how cheap Echo Dots are compared to Apple products. At $49 they’re less than the price of two 2-meter Lightning cables from Apple.
But Siri is the tool that will let Apple keep up with Google and Amazon, because it allows the company to create an interface-free device that people can interact with semi-naturally (I say semi-naturally because none of the voice assistants are currently up to the task of dealing with natural language, and some form of linguistic gymnastics are always required, despite what you might see in the on-stage demos).
So, with all that said, here are my WWDC 2017 Siri-related predictions:
Apple is developing a dedicated artificial intelligence chip to offload tasks like speech recognition and facial recognition on its mobile devices, according to Bloomberg.
The chip, known internally as the Apple Neural Engine, could help improve battery life and overall performance, the report said.
Apple is looking to include the chip in both the iPhone and iPad. Apple is said to have begun testing future iPhone prototypes with the chip, but it’s not clear when the dedicated chip could arrive or if it will be included in the next iPhone release this fall.
Apple has been rumored to be boosting its artificial-intelligence offerings to compete more fiercely against Google and Amazon, who have seemingly pulled ahead in the AI market for the time being. The Apple Neural Engine could be included in future Apple products such as self-driving cars or AI-powered glasses.
Apple could discuss its AI plans during WWDC in June, Bloomberg said. Its competitor, Google, introduced new AI offerings in May that extend to phones, connected speakers, and cars.
This isn’t the first time Apple has developed a dedicated chip. It included a dedicated M-series motion coprocessor in the iPhone 6S, and a W1 wireless chip in the AirPods it launched alongside the iPhone 7.
Apple has an AR team that is made up of hundreds of engineers working on AR-related features. In March, Bloomberg reported it’s headed by Mike Rockwell, who previously ran the hardware and new technologies groups at Dolby. Rockwell is reporting to Dan Riccio, head of the iPhone and iPad hardware engineering groups.
Apple’s move into AR makes sense. The company has made several large AR-focused acquisitions, including PrimeSense and FlyBy, the maker of AR-camera software. Apple CEO Tim Cook has called AR a better technology than VR and for everyone, not just a niche market.