
Perry Mulligan Transcript Excerpt from Alpha Select Conference 11/15

Perry Mulligan at 9th Annual Craig-Hallum Alpha Select Conference (webcast)

The following transcript was prepared by third parties and has been posted on MicroVision’s website for the reader’s convenience. Readers should refer to the audio replay, when available, on this site for clarification and accuracy.

9th Annual Craig-Hallum Alpha Select Conference November 15, 2018

Perry Mulligan, Chief Executive Officer:

Okay, good afternoon everyone. My name is Perry Mulligan. Welcome to this presentation. I'm here to give you an update on MicroVision, Inc. So with that as a backdrop, I'll let you spend 10 minutes reading the Safe Harbor statement [slide 2] and give you a sense of what we're about.

Everybody understands the relevance of the artificial intelligence platforms and the investment going into that ecosystem. We're not about that ecosystem.

What we're about is enabling unique input and output for artificial intelligence connected products, servicing four targeted verticals [slide 3]. So, we enable those products in a unique way to interface with the AI platforms.

If you think about it today, most people who have experience with AI-enabled devices understand that they can answer simple questions, that they have the ability to perform basic functions, and that they can listen to you speak. With our components and our modules, they can also display images, recognize touch and gesture, and sense with 3D clarity the space that surrounds them.

We're focusing on the user experience, making it easier and more natural and seamless to interact with the AI products [slide 4]. We said we are targeting four verticals [slide 5]. Those verticals are: interactive products, through our interactive display, or display-only through our licensed partner; consumer LiDAR, for smart home and security applications; automotive LiDAR, focusing on collision avoidance; and Augmented and Mixed Reality, through integrated display and sensor modules for binocular headsets.

How do we bring value to these solutions, these verticals? It's really through the embedded sensing and edge compute modules that we make and sell, enabling high-fidelity user interfaces, services, security and driver safety [slide 6].

Let's start to talk about what those applications, those verticals, look like. Everybody understands that the smart speaker market is growing, but today it is still very limited to voice-activated transactions. The numbers you see there, units shipped in Q2, represent a huge marketplace, but it's a marketplace that's hard to monetize today using voice only.

With our module embedded into that smart speaker, we can add interactivity [slides 7-8]. We project a bright, wide display image and, using time-of-flight, allow people to interact with that display image as though it's a capacitive touch screen. It has a very natural instant-on characteristic, so when it's not required - if the display isn't in use - it's not projecting. It's constantly in focus and it's very user friendly.
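For readers curious about the mechanics, the touch idea described above can be sketched as comparing each time-of-flight range reading against the calibrated range to the projection surface. This is a minimal illustrative sketch, not MicroVision's implementation; the function name, surface range and tolerance band are all assumptions.

```python
# Illustrative only: "touch" detection on a projected image via time-of-flight.
# A fingertip interrupting the beam returns a shorter range than the surface.

SURFACE_RANGE_M = 0.60        # assumed calibrated range to the tabletop
TOUCH_BAND_M = (0.005, 0.03)  # assumed: fingertip sits 5-30 mm above surface

def is_touch(measured_range_m: float) -> bool:
    """True if the returned range indicates a fingertip at the surface."""
    gap = SURFACE_RANGE_M - measured_range_m  # how far above the surface
    return TOUCH_BAND_M[0] <= gap <= TOUCH_BAND_M[1]
```

A reading at the bare surface, or an object far above it, would not register as a touch under this scheme.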

Furthermore, we've licensed the display portion of that, without the interactivity, to a display partner [slide 9]. We announced that license a few months ago. The partner we picked was someone who shared our go-to-market strategy, so they too are focused on products connected to the AI ecosystem. They have the right attributes to be successful, and we really believe that the volume component purchases they are going to bring through their products will help us achieve cost reductions throughout the entire supply chain.

So, we're excited about the economy of scale that this is going to bring us. The basic elements of the agreement are that they paid us $10 million in license fees this year, 2018, that they will be purchasing components from us, our ASICs and MEMS, on an ongoing basis, and that there are minimum hurdles required to maintain that exclusivity on an annual basis.

I think it's pretty obvious, and I won't spend any time here illustrating again, that the smart speaker market is growing at a pretty exponential rate [slide 10]. What's of interest is when we think about the fact that they are growing quite considerably in Asia as well. So it's not only a North America phenomenon, but rather a global phenomenon.

One of the verticals that we targeted and that we wanted to bring to your attention here is consumer LiDAR [slide 11]. In this space, the device that you see on the screen at the bottom center right is about the size of a Snickers chocolate bar. That device gathers a very high-density point cloud at approximately 20 million points per second.

With machine learning embedded at the device with very low power and very low latency, we can pass to the AI product a very strong sense of what's happening in that room. We, in essence, become the eyes of the AI product, but the marvelous thing about this is, as you know with the LiDAR product, this is X, Y, Z data and some density information.

So, if somebody has to try to tap into that as they would with perhaps a camera feed, they'd get a set of information out of it that's unintelligible. It's not like having a camera on you in your house because nobody can tap into it to get that information out. It's very difficult. So, this allows the AI platform to have the sense of the space that it is working in, have a sense of what's going on in the room, the number of occupants, the number of things that are going on. And it gives that AI platform and that AI connected product a way to help control a smart home in the future.

When you think of Automotive LiDAR [slide 12], people asked us last year, why are you not talking about autonomous driving vehicles? Aren't you talking about 200-meter LiDAR? We absolutely believed that the solutions we had were relevant to collision avoidance. And it's interesting to see that with the safety regulations coming to pass in Europe, and subsequently in North America, at Level Three, collision avoidance is, in fact, the volume-driving element of that market.

So, I fully respect that autonomous driving vehicles are on a 20-, 30- or 40-year-out timeframe, but the near-term market demand is going to be covered by Level Three safety requirements: collision avoidance, emergency braking, emergency steering. We believe our technology gives us an advantage in providing those solutions to the automotive makers. And I think everybody in the room understands the significance of the volume and the materiality of that play.

Finally, if we think of Augmented and Mixed Reality [slide 13], I'm sure that each of us in the room would have a different opinion of when that market becomes mature. I'll leave it to you to decide whether that's 2020, 2021 or beyond. But if you believe that Augmented or Mixed Reality is going to have an inflection point and displace Virtual Reality as the prime use case, then you have to believe that laser beam scanning technology is in fact a solution that's required to make that happen.

Our technology provides high display resolution, wide field of view, low persistence, small size, low power and low weight - factors absolutely required to enable that technology to mature. So we're very comfortable that our core technology allows us to be a predominant player in that space.

I've been with the company a year now and I was thinking, how do I help convey to you where we've come in that period of time, and I thought it was pretty interesting to me as I thought of what we were trying to convey last year and where we are today.

The team has absolutely demonstrated thought leadership [slide 14]. Last year we were trying to convince you that AI was the product and that products connected to it needed to have input and output capabilities. I think that point is now obvious and conveyed. I think we told you last year that we were focused on collision avoidance in the automotive space, and that decision is, I think, being played out and is obvious.

Relevant now, I think we're showing thought leadership as it relates to our consumer LiDAR approach to problem solving as well. Second, we're migrating as a company, going up the evolutionary curve, moving from a technology solutions provider that was typically handcuffed to licensing its product solutions to someone else, into a company that's going to scale and move into module production with our technology.

We have the partnerships with the OEMs and key suppliers that will allow us to bring those products to market and achieve the revenue we deserve from that. And finally, we believe our products in each of the verticals are already aligned with the OEM product path. I don't think we're at the stage where we're a product looking for a solution. We're actually providing solutions that meet needs OEMs have within their product portfolios.

For those of you unfamiliar with us, I wanted to explain how our engine works [slides 15-16]. The display engine, the extremely tiny item in the picture on the page, is probably the size of the end of your thumb. It allows us, by discretely controlling a red laser, a green laser, a blue laser and a MEMS mirror, to control the picture dot at each place that we paint that image. And subsequently, sweeping our 2D MEMS scanning mirror in a raster pattern, we paint pretty pictures. They're high-definition pictures, they're constantly in focus, and they create a relatively wide viewing experience.
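The raster-scan idea can be sketched in a few lines of code. The field-of-view numbers, resolution and function names below are hypothetical placeholders for illustration, not MicroVision's actual parameters or firmware.

```python
# Hypothetical sketch of raster-pattern laser beam scanning: for each pixel
# the mirror is steered to an (x, y) deflection and the red, green and blue
# laser intensities are set for that single "picture dot".

H_FOV_DEG, V_FOV_DEG = 40.0, 22.5   # assumed scan field of view
COLS, ROWS = 1280, 720              # one 720p frame

def mirror_angles(row: int, col: int) -> tuple[float, float]:
    """Mirror deflection (degrees) that points the beam at pixel (row, col)."""
    x = (col / (COLS - 1) - 0.5) * H_FOV_DEG   # fast horizontal axis
    y = (row / (ROWS - 1) - 0.5) * V_FOV_DEG   # slow vertical axis
    return x, y

def paint_frame(frame):
    """Scan every pixel once; frame[row][col] is an (r, g, b) intensity triple."""
    for row in range(ROWS):
        for col in range(COLS):
            x, y = mirror_angles(row, col)
            r, g, b = frame[row][col]
            # set_lasers(r, g, b) while the mirror sits at (x, y) -- hardware step
```

Because each dot is lit by a steered beam rather than a fixed panel, the image stays in focus at any projection distance, which is the "constantly in focus" property mentioned above.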

So, if we now expand that and say, well, how do you generate a LiDAR image out of that [slide 17]? Fortunately for us, all we have to do is add an IR laser and a photodiode. With that, coupled with our time-of-flight capabilities, we can generate the X, Y and Z coordinates of any point in space in front of the scanner. So, we end up with a LiDAR image of that environment.
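The underlying math is straightforward and worth seeing once: the round-trip time of the IR pulse gives range, and the mirror's pointing angles turn that range into Cartesian coordinates. This is a generic time-of-flight sketch, not MicroVision's signal chain; the function names and angle convention are assumptions.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_to_range(round_trip_s: float) -> float:
    """Range = c * t / 2, since the pulse travels out and back."""
    return C * round_trip_s / 2.0

def point_from_scan(round_trip_s: float, az_deg: float, el_deg: float):
    """Convert a range plus mirror azimuth/elevation into an (X, Y, Z) point."""
    r = tof_to_range(round_trip_s)
    az, el = math.radians(az_deg), math.radians(el_deg)
    x = r * math.cos(el) * math.sin(az)   # left/right
    y = r * math.sin(el)                  # up/down
    z = r * math.cos(el) * math.cos(az)   # depth along the boresight
    return x, y, z
```

A pulse returning in about 6.7 nanoseconds corresponds to a point one meter away; repeating this at every mirror position is what builds the point cloud.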

[Video Presentation Start, slide 18] ... I encourage you to go to the [MicroVision] website and take a look at it [video demonstration], and I think you'll find it a very easy, very seamless, transparent experience for anyone who walks up to the device to order items very efficiently and effectively. Think of it as the micro-transaction - the ability, using a smart speaker, to interact and seamlessly order food, order devices. One of the tests that we think of is ordering $200 or $400 worth of groceries from Whole Foods within four minutes, going through the menus and the selections, and that's what we allow people to do using that interactive display.

So, at the basis of this, somebody would say, it sounds like you have an interesting portfolio [slide 19]. Why wouldn't a large entity simply come and do it faster, cheaper, better? The answer to that question lies in the fact that it's taken us a long time to get here, and this is a journey that's a byproduct of a lot of work, including 500 different patents that we've filed and our core IP wrapped around our embedded software.

Again, if I think of where we were last year at this time, we were talking to people about adding machine learning and machine intelligence to our sensors to allow them to do things very quickly and efficiently in a low power manner. And people were telling us that the market for those skills was very tight. But we've managed to accomplish that and are doing that and we're shipping products with that into the Dev Kit platforms we're releasing today.

We have custom ASICs and MEMS that, again, are a differentiator for us, custom hardware, and finally advanced manufacturing capabilities that allow us to compete and scale as required.

From a financial highlights perspective [slide 20], based on the numbers from September 30, you'll see that we've basically doubled year-over-year as far as revenue is concerned. We carry zero debt on our balance sheet. We have approximately 93 million shares outstanding, and warrants and options at around $2.50 [slide 21].

We've talked repeatedly about having the right product and the right technology, but I think you all recognize that without the right product, the right technology at the right price point, when the market needs it, you don't necessarily have revenue.

So, when we map the four verticals [slide 22] that we're talking about onto that grid, we look at it and we say we definitely have the quality of features and the right price point, or cost point, for Augmented and Mixed Reality. But we don't believe the volume for those products will have an inflection point in 2019. We actually think those volumes happen further out and, again, I'll let you determine where that is.

So, when I think of 2019 revenue opportunities for us as a company in that space, I think there's a chance we will sell a small number of units, but we will probably have a chance to do non-recurring engineering work for people in that space.

With regards to the IoT products, which are our interactive display, or display-only through our licensed partner, we absolutely believe that the quality of the solution we're providing today meets what's required in the marketplace, that the cost structure is appropriate and scalable, and that the customers' product and market timing is ideal. So, we are of the mindset that we expect to see a volume ramp of that product in 2019.

Consumer LiDAR, on the other hand, while it has the quality of features and cost that we expect, we think is disruptive enough that this product is going to take a year of our customers working with it to be able to launch and decide the features they want to bring to market. So, we're counting on that as a 2020 potential launch candidate.

And then automotive LiDAR, again, we think might be further out. We believe that the technology and features we have are ideal. We think we're approaching it from the right cost perspective, but we think more research has to be done to bring that solution to market. So, we're looking at that as a 2020 or 2021 opportunity.

Again, I'm stressing the fact that we're customizing these solutions and bringing to market these products predicated on a strong core base of IP and those customizations being applied to each of the vertical markets.

To help you clearly understand what we're trying to accomplish here [slide 23]: we expect to sell our interactive display and our consumer LiDAR modules, with software, to manufacturers and OEMs. We're going to sell component parts and software to our licensee for display-only modules. And as some of you may know, we have a $24 million contract, and we expect to sell components to that $24 million contract owner.

The key to ramping this [slide 24] is having world-class ODMs and contract manufacturers that are already in the supply chain for OEMs for modules. We're not trying to reinvent this and we're not trying to get qualified. We are already affiliated with these folks.

The ODMs allow for low manufacturing investment and control of expenses during this ramp. World-class large contract manufacturers help reduce our financial and working capital needs as we go through this inflection point. And common components between the display-only, interactive display and consumer LiDAR products should drive the costs down due to shared volume.

But even more fundamentally, it starts with a manufacturable design [slide 25]. We've approached this with a sense of consumer scale. We've leveraged industry-standard MEMS, silicon, ASICs and lasers. We use highly automated manufacturing processes proven to operate at high first-pass yields. This is not requiring us to engineer and invent on the shop floor. We've established manufacturing lines that are easy to replicate so we can scale as the volume increases.

And we can leverage a balance between our in-house process experts and the in-country manufacturing experts to support that scale. So that's why we feel comfortable that we're going to be able to accelerate and achieve the inflection point that we expect to have.

Historically, we as a company have offered four different paths to generate revenue [slide 26]. We've done module sales. We've done custom modules and components. We've licensed modules and we've provided engineering services. I hope after this conversation you clearly understand that our focus is to capitalize on the module sales that we think we have in front of us.

Lastly, the company is focusing on large emerging markets [slide 27]. They have significant opportunities, and we've said repeatedly on our earnings calls that, given our desire to get to breakeven profitability at some point in 2019, any one of these verticals has the capability, the volume and the demand that could achieve that for us.

We have a strong experienced management team. I joke that I think that's a polite way of saying we're getting old, but I think we've been around the block once or twice and know how to get this done.

We have a great intellectual property portfolio and we're leveraging that. And I hope you understand that the leverage means that a lot of the deep, R&D work required to make these products and these solutions available is done. We're now working on customizing these solutions to make them work in the specific applications. And we have opportunities in front of us with our own modules, with the $10 million licensee agreement that we signed this year as well as the $24 million contract we've been working on since April of 2017.

So at the risk of running over, I wanted to say thank you for taking the time with us today. We're going to open up for any questions that you might have - we tried to get back on schedule, but we're not quite there. We're open for questions.


Question You talked a little bit about having visibility on the smart speaker market, whether that's on the display-only or on the interactive side. Can you talk a little bit about why you feel like you have visibility there and when we might share in that visibility?

Perry Mulligan, Chief Executive Officer The wonderful thing about working with large OEMs is that there is a cone of silence about just about everything you do. So we know that we're not going to divulge much in the way of forward-looking activity. I joked with someone that the first time you'll know is after the unit ships and somebody does a teardown on it - they'd see MicroVision inside, and you'd get a sense of what the cost of the bill of goods is.

The natural cadence that we would expect is that you would begin to see product readiness. Lead times for silicon are pretty extended, so anybody in the room who monitors technology understands that those are reasonably long. So when you start to see some of those purchase orders or activities from us that identify those purchases, that would be evidence of it. I don't think you'd ever get a clear indication, until the product is shipping, as to who the customer is and what the exact form factor is.

Question So I guess it's safe to say that you're starting to get very specific - internally specific - now on these targets and the products that are going out?

Perry Mulligan, Chief Executive Officer Well, there are two things. I think we make it sound like, with a little bit of luck, we will fall onto the solution. I want to make sure that people understand it's a journey of a couple of years. Right? We're not at this stage of product readiness without having had many discussions of what solution we're trying to bring to the market, what problem we're trying to solve for the customer.

During our last earnings call, we actually alluded to the fact that we saw money move out. We thought we were going to have some NRE monies from our licensee partner, but it turns out that the reference design that we provided didn't need to be customized - the feedback they're getting from the OEM implies that it might be adequate and sufficient. That does two things. It tells me that they're getting traction, or they might be getting positive feedback. And it also tells me that it reduces the likelihood that there's a delay. Right? Because any time you have to customize something, that takes time and energy. So, we've characterized this through the course of this year as a journey of a thousand steps - you know and I know 'no' can be said at any time - but we're feeling pretty optimistic that we will be successful.

Question In this context - in some of these different end markets, are you sole-sourced? It is a pretty specific technology. So are you in a sole-source situation, or is it [indiscernible]?

Perry Mulligan, Chief Executive Officer Well, let's look at the smart speaker market. The last data point I think we showed up there was 85 million units expected to ship in the year 2021, and we expect an install base of 350 million. If I were to identify the fact that I was going to be sole-sourced in those smart speakers, I think we'd be having a different conversation. What I will tell you, though, is that across that plethora of solutions, there will be many different people trying to experiment with ideas on how to help monetize that AI platform and the investment in the peripherals.

Within the context of that, some features become pretty important. If you recognize that smart speakers are voice-activated, they are ultra-sensitive to fan noise. They use MEMS mics, so vibration of any sort disables them to a large extent. So when you think of that, you say, okay, well, if I wanted to have a nice large display, is there another projector-type solution available?

It becomes hard to find one that fits the power requirements without a fan. And then you couple that with the notion that says not only does it have to display, it has to have 3D interactivity, and I don't want it to be a light plane - I want it to actually be LiDAR-based, so if you have a cutting surface in front of it, it can capture the different heights. It doesn't have to be a perfectly level surface. It's always in focus. Then I'd say to you, within that context, I can clearly tell you I believe we are unique in our space there. So I think we've got something that has some legs, that solves a problem for the OEM and that has multigenerational applications to it.

Steve Holt, CFO I'm Steve Holt, I'm the CFO. I'm not sure where you're going with that question exactly, but part of it is about how the customers might look at the risk in the supply chain. We have ASICs and MEMS that are made by STMicroelectronics and other top-tier suppliers, so no concerns about them and their ability to ramp. Also, the actual modules that we sell are manufactured by a several-billion-dollar company in China that's in the supply chain for the tech Tier 1s already, so there are no real concerns about them ramping. So our supply chain, we think, is in really good shape.

Question Steve, there's still a question about your potential and the opportunity - your understanding of what percentage of the market you could have, what percentage of the customers you could have. Is that still the question?

Steve Holt, CFO Right. I think when you look at the installed base of these smart speakers, the first version of them is voice-only, and I think we're up to somewhere around 60 million installed - it's in that ballpark. The next versions are trying to figure out the right solution for a display and interactivity. And so you can see that there could be a whole other generation, and a very high percentage of it could be that.

Question When the next generation gets to be interactive, will there be multiple companies competing for that technology? If I'm Amazon, am I going to try to use multiple vendors - might you be one of them?

Steve Holt, CFO We don't think there's anybody that can do what we can do. Yeah.

Question If you think about it, there are ways to have displays that are not just projection, right - there are LCDs, there are plasmas - it's a different type of technology. But if you are talking about a low-power, low-heat projection solution that doesn't require a fan, I think what they are saying is true - there is really no competition in that?

Perry Mulligan, Chief Executive Officer And if we pull the thread on that a little bit - when you think of a 15-inch display or a 20-inch display, most people that we talk to say they don't want to have more black screens in their house; they don't want more real estate allocated to that kind of a device. Not to mention the cost advantage we'd have at that size and performance.

So we think there are some inhibitors that would prevent some competing technologies from matching the fit, form and function. As for whether there are people trying different versions - we encourage them; there's lots of space for everyone.

Question Your license - your $10 million license deal - is that one-and-done or is that an annual deal?

Perry Mulligan, Chief Executive Officer The $10 million was one-and-done; the recurring requirements for them are to support the exclusivity. And I think we did the math on this, Steve - we said that it was approximately $20 million a year in component purchases from us to maintain exclusivity.

Question Gross margin on that?

Perry Mulligan, Chief Executive Officer I think - and again, I don't believe we ever talked about the gross margin on the contract per se. I think historically, Steve, we've talked about gross margin at a component level and gross margin at a module level being somewhat different.

Steve Holt, CFO So we think on modules it's moving towards 40%, and on component sales it'd be closer to the 25-30% range, in those ballparks.

Question So what is the incremental cost to the OEM?

Perry Mulligan, Chief Executive Officer For the difference between the two?

Question I'd probably say - you said it is an add-on; on the street, what is it going to cost the consumer?

Perry Mulligan, Chief Executive Officer Well, I think all of the OEMs recognize that there's a high price sensitivity. So this is MSRP - it's really not our cost. I don't believe that anybody here is working on the premise that by simply tacking on an additional cost onto a speaker, it's going to stand alone. I think it has to fit within the models that they have.

Question So how was it - what are your costs?

Steve Holt, CFO Yeah. We're not saying what our costs are per se, but we know that these guys want to sell at $199, $149 and $99 - it has to be absorbed at those price points. They want different versions of the product.

Question So I've got it - you're saying you're the eyes of the AI, the eye of AI.

Perry Mulligan, Chief Executive Officer I/O.

Question Right. Are you familiar with some of the security technology? It seems like a perfect application for security cameras. Am I wrong to think of it as another vertical for you?

Perry Mulligan, Chief Executive Officer Well, again, we really think that it becomes much more linked to the smart home hub, right? We allow the AI interface device to have the level of awareness it needs in as many rooms as you want, and then control that information flow back through the other APIs that are open and connected with it, because all of these devices connect into…

Question But I understand - these people put cameras in office buildings, cameras in homes, but they're not intelligent cameras, right?

Perry Mulligan, Chief Executive Officer Right, yes.

Question [indiscernible] It's not something off-the-shelf. Knowing that there are a few people entering the room, you can identify them. Especially in a big office building, security people can't look at all the cameras that are out there now. They're learning - learning faces and movements and recognizing things. This seems almost a better application than just AI?

Perry Mulligan, Chief Executive Officer So if you think of it though, again what we're suggesting is that just as AI is prevalent in your home today, we expect AI to be prevalent in business. And we expect that we will enable those solutions as that evolves, right.

Question You expect it to be on a factory floor?

Perry Mulligan, Chief Executive Officer Wherever they want to take it.

Question So I'm fairly new to this story, but I'm just wondering: are you in current Amazon shipping products, or do you want to…

Perry Mulligan, Chief Executive Officer Yeah, we're not shipping products with Amazon today. We are talking about an interactive display solution for potential use in the…

Question They already have the Spot and other products that already have LCD screens. So is there a roadmap - are you in line with a roadmap for Amazon to go towards a projected display image [indiscernible]? Are you going to see if this product works well, or how does that work with them?

Perry Mulligan, Chief Executive Officer Yeah, I don't believe that I'm qualified to talk about Amazon's strategy across their entire speaker portfolio. All I can tell you is that we believe, within the range of speaker product solutions, there is space for voice-only activated devices, voice plus display, and voice plus display and interactivity. So we think there's enough space in there for all three of those solutions to coexist, and as for how they want to deploy it, as someone else suggested, there's going to be a plethora of solutions. I'm sure that they will experiment.

Question Well, there is no current roadmap [indiscernible]

Perry Mulligan, Chief Executive Officer I can't comment on their product roadmap strategy.

Steve Holt, CFO I think what we're saying is we're engaging with - we can't identify a party, so we're saying Tier 1 OEMs, which means Amazon, Microsoft, Apple, Google, those kinds of players that have an AI platform. Those are the ones we're engaging with, and we're targeting to launch a product in 2019 within that smart speaker space and Internet of Things space. And so that's what we're able to say today.

Perry Mulligan, Chief Executive Officer Thank you all.