June 27, 2016
Take 3, Scene 9: The WWDC 2016 Experience
In this episode of Take 3, we continue the conversation about WWDC 2016 – Apple’s Worldwide Developers Conference – with 3Pillar’s Sean Kosanovich. He joins us in the studio to talk about his experience at the latest WWDC, how it compares to previous years, and what to look forward to from Apple.
- Sean discusses how WWDC has evolved over the years and his excitement over the shift in focus from Objective-C to Swift
- We talk about what Apple’s choice to open its services to third-party developers will mean for the future of both the development community and Apple’s reputation
- Sean shares the story of how he met Apple royalty at this year’s WWDC
About the Guest
Sean Kosanovich is a Senior Software Engineer at 3Pillar with a background in full-stack development and a focus on native mobile applications.
To watch Craig Federighi’s portion of the WWDC keynote, tune in on Apple’s website. Craig’s portion of the keynote runs from 35:25 to 66:15.
Read the Transcription
Julia Slattery: As someone who has attended the last three WWDC conferences, what can you say about how it has changed and evolved over the years?
Sean Kosanovich: Honestly, not much has changed, and I think that’s a good thing. The most valuable part of the conference is the sessions and the one-on-one time you get with Apple engineers, and if anything, that has only increased over the years. There were a couple of small changes I noticed: in previous years, Swift was obviously very new and a lot of the sessions were still in Objective-C. This year, however, every session was done in Swift, and I think that really points to Apple’s future and where they’re going. They have essentially phased Objective-C out of their developer conference.
Julia Slattery: What would you say was the highlight of your experience this year?
Sean Kosanovich: So this year, like I mentioned earlier, was all about Swift. And as a big Swift lover, seeing Swift mature was really cool for me. Back in December, Swift was open-sourced, and the community has really taken to it. Swift 3, which is going to be out this fall, has been a collaboration between Apple engineers and the open source community. The biggest goal with Swift 3 is to be source-compatible moving forward. What that means is, if you write Swift 3 code now, it will still work when Swift 4 and 5 come out in the future, and you won’t have to change it. That’s been a big pain point for developers. A lot of developers have complained about the changing syntax – why is Apple changing it, why can’t they get it right. But I think people fail to realize that Swift isn’t even three years old yet, while Objective-C is over 20 years old. Swift is not a mature language yet, and seeing that maturity coming in now is awesome.
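To illustrate the kind of syntax churn Sean is describing, here is a minimal sketch (not from the episode) of two Swift 2 idioms and their Swift 3 replacements:

```swift
// Swift 2 wrote these as:
//   let shout = name.uppercaseString          // a property
//   for var i = 0; i < 3; i++ { ... }         // C-style loop with ++
// Swift 3 removes both forms in favor of:
let name = "wwdc"
let shout = name.uppercased()                  // method call replaces the property
for i in 0..<3 {                               // range-based loop replaces the C-style loop
    print("\(shout) \(i)")
}
```

Migrating existing code across changes like these is exactly the pain point that Swift 3’s source-compatibility goal is meant to end.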
Julia Slattery: One of the biggest announcements from the conference was that Apple is making more of its services available to third-party developers. Do you think this is just a response to the other voice services like Amazon’s Alexa? How do you think this will impact the development community?
Sean Kosanovich: Yeah, for sure. A lot of the iOS 10 features have been about expanding Apple’s core applications through extensions, and certainly some of those extensions are in response to competitors such as Alexa. One of the extension points they added is iMessage apps, which I really think is a response to platforms like Facebook Messenger, WeChat, and Slack. They’ve had rich chat platforms for a while that can do apps and stickers and all these fun things that consumers like. I do think that’s going to resonate with users and maybe even pull people into the iMessage ecosystem.
Another extension point is Siri. As you probably saw, Siri is now quasi-open to developers. I say quasi because it’s not a full-on API like Amazon’s Alexa or Google’s new voice assistant. But this is definitely a direct response, and Apple is taking a very cautious approach by only opening it up to certain domains, like paying friends or ordering an Uber, for example. It’s very limited in what it can do; I’m assuming Apple is doing this so they can better control the experience and then expand from there, but it’s definitely not on par with the Alexa API yet. Apple also added extension points for Maps. Now you can make a restaurant reservation or order an Uber directly in the Apple Maps application, which is really cool, because I think the less moving around users have to do between apps to accomplish a task, the better. It’s just going to make for a much better experience.
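As a rough sketch of what this “quasi-open” Siri integration looks like in iOS 10’s Intents framework, a payments app might handle the send-payment domain along these lines (the class name and response handling are illustrative, and a real SiriKit integration also needs its own app extension target and Info.plist entries):

```swift
import Intents

// Illustrative handler for Siri's "send payment" domain (SiriKit, iOS 10).
class PaymentIntentHandler: NSObject, INSendPaymentIntentHandling {
    func handle(sendPayment intent: INSendPaymentIntent,
                completion: @escaping (INSendPaymentIntentResponse) -> Void) {
        // Hand the request off to the app's own payment logic here,
        // then report the outcome back to Siri.
        completion(INSendPaymentIntentResponse(code: .success, userActivity: nil))
    }
}
```

Note how the developer only fills in the handling of a predefined domain; unlike Alexa’s skill model, the phrasing and the set of supported tasks stay under Apple’s control.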
Then the last two were rich notifications and home screen widgets. With rich notifications, you can essentially have a widget as your notification and let users interact with your app without actually having to open it, for common tasks like responding to a message. With home screen widgets, when you 3D Touch an app icon, you can display a widget right there on the home screen that users can interact with, so they don’t even need to open your app. Again, I think this is really going to help enhance the user experience.
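Here is a sketch of what wiring up one of these actionable notifications looks like with iOS 10’s new UserNotifications framework (the identifiers and titles are made up for the example):

```swift
import UserNotifications

// Register a notification category with an inline text-reply action, so a
// user can answer a message straight from the notification without opening
// the app (iOS 10 UserNotifications framework).
let reply = UNTextInputNotificationAction(
    identifier: "REPLY_ACTION",
    title: "Reply",
    options: [],
    textInputButtonTitle: "Send",
    textInputPlaceholder: "Message")

let messageCategory = UNNotificationCategory(
    identifier: "MESSAGE_CATEGORY",
    actions: [reply],
    intentIdentifiers: [],
    options: [])

UNUserNotificationCenter.current().setNotificationCategories([messageCategory])
```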
I do have one concern. If this is executed well, the new extension points are really going to resonate with users. However, opening up your core applications to third-party developers can lead to some issues. Apple’s reputation for quality – they are known for not shipping an extreme number of bugs – could be impacted. If the Uber app were poorly coded – I’m not saying it is, but for example – and it crashed the Messages app or the Maps app or Siri, a lot of users wouldn’t know what was responsible; they would just see their phone crashing all the time. So I really hope that Apple takes a much harder review stance on applications that integrate with the core apps. If they do that, I think this will be a really useful feature for users.
Julia Slattery: There’s also been some buzz that Apple’s iOS 10 release is a copy of Google’s latest OS and that it’s going to start a mobile war. Do you think this will bring about the dawn of a new mobile age?
Sean Kosanovich: Yeah, I think the mobile industry as a whole has kind of hit a plateau. If you look at the new Android mobile operating system, Android N – which is in beta now – it borrows a lot of features that iOS has had for a couple of years, such as split-screen multitasking, picture-in-picture, and actionable notifications. So Google was blamed for copying Apple, and now we have Apple playing catch-up to Google.
I think the industry is really looking for the next big innovation. I don’t think it was there with 3D Touch or Touch ID, because those are easily copied by other device makers. So the mobile industry as a whole is looking for a new innovation, and until that comes along, whenever that is, I think there is going to be a lot of this copying back and forth between Apple, Google, Windows Phone, and so forth.
There were a couple of other features Apple announced where people have pointed out that Google has had them for a long time. One of those is that Apple’s new Photos application in iOS 10 can now recognize objects and scenes. If you’ve used Google Photos, you know you can search for “snow” and it will return every picture in your library that has snow in it. Google does this with machine learning algorithms that run server-side, so what it learns from your pictures, it can apply to Bob’s pictures as well. It’s not very good for privacy, but it gives you a really cool feature. iOS 10 is going to have the same object and scene recognition. However, Apple is not doing it in the cloud; they are doing it on the device, for privacy reasons. So even though Apple is playing catch-up on the end-user feature, Apple is really concerned with user privacy, and they are just doing it in a different way from Google.
Julia Slattery: So since the last WWDC, Apple announced the large 12.9-inch iPad Pro and the 4-inch iPhone SE. Did Apple follow this trend and announce any new tools this year to help developers accommodate the new screen sizes?
Sean Kosanovich: Yeah, certainly. One of the new features of Xcode 8 is an Interface Builder mode called “Preview.” Preview allows developers to see how their applications look on the various devices without having to actually compile and run the application. This is going to save a ton of time.
Across the bottom toolbar of Interface Builder, there is now every device that your app targets – from the larger iPad Pro all the way down to the smallest iPhone – and you can choose any device and orientation and see exactly how your app is going to look in real time. Another really cool thing with Preview is that you can select which devices you want to edit your layout for. Maybe you have a label you only want shown on the larger-screen iPads – you can select just those iPads, make your edits, and hit save, and now that label will only appear on those iPads. That’s going to help developers a lot with adaptive UI. Apple took it even a bit further – a lot of developers are probably used to constraints, or more likely used to fighting with constraints in Auto Layout, but now Interface Builder automatically generates your constraints for you based on your current layout, which is really cool. And for the times when the constraints aren’t right and you have runtime issues where something doesn’t look quite like you hoped, the new view debugger can actually point these out, instead of producing an archaic mess of random unique-ID constraint errors. So there have been a lot of changes there to help developers create adaptive apps.
Julia Slattery: I heard that you had a brush with Apple royalty while you were at the conference. Could you tell that story?
Sean Kosanovich: Yeah, certainly. You always keep your eyes open for Apple executives, as they often just walk around the conference. One day, I was sitting off to the side doing some work, and someone sits down next to me. I look up, and it’s actually Craig Federighi, Apple’s Senior Vice President of Software Engineering. He just casually sat down next to me like any other attendee, pulled out his MacBook, and started doing some work. At first, I wasn’t sure it was him because the way he sat down was so casual, but then I started talking to him, and we talked for a good 10-15 minutes about everything. He asked what I do, how I was liking the conference, and about 3Pillar Global. So that was really cool. At the end, before I had to get going, I asked him for a picture. And I think it was at that point that everyone else realized it was Craig Federighi, because then there was a huge line of people wanting to take a picture. So that was kind of funny.
Julia Slattery: That’s awesome.
Sean Kosanovich: Yeah, it was a lot of fun.
Julia Slattery: What a great story to come away with.
Sean Kosanovich: Yeah, right? I was joking with people that I knew were there, that Craig and I are buddies now.
Julia Slattery: Obviously, you got a picture two years ago, and now you’ve got a picture this year.
Sean Kosanovich: So we must be best buds now.
Julia Slattery: That’s exactly what it means.