Almost two years ago we sat down to put together all of our thoughts about testing and testability for the fledgling Alexa platform. In light of recent events that have had us linking out to that article a few times, we decided it might be time for a bit of a retrospective on the topic, and to present our view of where things stand today.
It's a little out of the ordinary for the content we post here, but one of the things we at 3PO-Labs (and specifically Eric) find ourselves doing with some frequency is advocating on behalf of other developers who end up in situations they don't know how to resolve. Often this happens during the certification step, where a first rejection can seem like an insurmountable obstacle, especially for folks less familiar with how things work behind the scenes. We recently took some time to argue on behalf of a few of them, and I wanted to share what that looks like a bit more broadly...
We wanted to take just a moment to talk about a new Alexa feature, our implementation of it in AstroBot (our space API aggregator skill), and thoughts on how best to take advantage of the new capability.
As mentioned in our previous post, we've had the opportunity to play around with the new Alexa push notifications feature for some time now. While the exact implementation details on the Alexa Skills Kit side are still not public (and therefore not something we can talk about until the public beta goes live), there are enough consumer-facing pieces that we CAN talk about that should interest folks starting to think about their push notification use cases.
So, some exciting news from Amazon today - the first batch of skills with push notifications enabled from the private beta have finally been released. Unfortunately, in their official post Amazon called out several skills as exemplars of the new feature but failed to mention that we at 3PO-Labs were also in the beta, and also have a live skill with notifications: AstroBot. In fact, due to a clerical error with certification a couple of months ago, we were actually the very first third-party skill to go live with notifications. We've been incognito ever since, but we're relieved to finally be able to talk about what we've done...
Hi everyone. We at 3PO-Labs have made no secret of our frustrations around testing the Alexa voice/intent model. Around this time last year I started playing around with a sloppy hack to address the problem. I've fiddled with it off and on since then, but a recent renewal of interest in the problem has pushed things to a point where I can start to peel back the covers on what I've built.
Quick note for anyone in the Puget Sound area - we here at 3PO-Labs are organizing a recurring Alexa meetup, and we've just announced our first session on July 27th. Details can be found here: https://www.meetup.com/Seattle-Alexa-Meetup/events/241808747/
Hope to see you there!
Welcome to a new column at 3PO-Labs, which we're (for now) calling VUXcellence. The idea behind this series is to shine a light on interesting voice UX techniques that developers are using to make their voice interfaces more intuitive, efficient, or natural. First up, we're gonna take a look at TsaTsaTzu's massive RPG "Six Swords".
We're 3PO-Labs. We build things for fun and profit. Right now we're super bullish on the rise of voice interfaces, and we hope to get you onboard.