AWS re:Invent was the event where AWS needed to convince the market that it is still at the forefront of innovation. A different wind has been blowing through the IT world since COVID, and it has picked up with the arrival of generative AI. AWS seems to be facing a strong headwind, while Google and Microsoft are sailing along with it.
The new wind in the IT world calls for more simplicity. Organizations want to innovate faster, but not by adding complexity. Innovative solutions may be complex under the hood, as long as they are easy to implement and manage. IT decision-makers are currently won over by complex solutions presented simply. The problem we see with AWS is that it doesn't seem to speak that language. AWS became big by appealing primarily to developers, and you can see that in the keynotes, where extremely complex processes are discussed at length. Today, CIOs are more in the lead than ever. They are looking for efficiency, for innovation, and for how it helps the business, and they tune out of these complex AWS stories.
AWS mainly appeals to technical people, not IT decision-makers
AWS primarily appeals to developers to market its massive catalog of PaaS solutions. After all, if AWS has anything, it's choice: a tremendous number of services and endless variety. If you want to build a cloud-based software solution with complete freedom, AWS is the place to be. That is also the language AWS speaks, the language of the developer. However, if you want more simplicity so you can innovate faster, you often benefit more from a selective menu, and Google Cloud and Microsoft Azure increasingly have the edge there. The difference between the cloud players is stark: AWS is good at developing a massive number of services and offering variety, but poor at marketing a product.
By keeping it simple, you can innovate faster
A few years ago, Google and Microsoft had already figured out that AWS was trying to develop as many services as possible and offer customers enormous choice. The advantage of such an extensive portfolio is also AWS's pitfall. A huge portfolio creates complexity, because you quickly lose the overview. AWS has also created an expectation among developers that every service comes with many customization options, which makes solutions considerably harder to manage.
Google and Microsoft, by contrast, make choices and opt for overview and more simplicity. This allows them to market their clouds more as a product. Take Google Cloud as an example: like AWS, it has a broad cloud portfolio in which you can run essentially all your workloads. At the PaaS level, however, there are fewer choices. Google offers a handful of standard solutions that cover what the market needs. If you still want an alternative product or a variation, you have to rely on the marketplace or roll it out yourself. In doing so, Google nudges its customers toward the standard solutions. The choice is more limited, but that is fine for 99 percent of organizations.
It's a bit like an oversized menu in a restaurant. With 20 options, you're done choosing in five minutes. With 20 options and another 200 variations, it takes much longer.
Amazon Bedrock presentation made generative AI too complicated
We also saw this complexity in the keynotes when Amazon Bedrock came up. AWS is jumping on the generative AI train with Amazon Bedrock, but the way it presented Bedrock and generative AI was too complicated. Developers working with AI can follow it, but the average IT decision-maker dropped out within two minutes. AWS still finds it necessary to explain, at a technical level, how to duplicate and modify a large language model (LLM), walking through the whole process of connecting data, creating vectors and scaling with Ray. Even for the average developer new to AI, it was hard to keep up. This is where AWS fails in its go-to-market strategy. Sharing this kind of information is fine in a smaller room somewhere during the event, but not in the opening keynote. It clearly shows that AWS does not know how to market a product to IT decision-makers.
It was also stated during the keynote that customers who had already started working with Amazon Bedrock needed support from AWS to get it all working correctly. In doing so, AWS confirms that it has not yet automated many of the complex operations surrounding LLMs that would make them easier to use.
Not chic: indirectly lashing out at the competition
What you often see with large U.S. companies that come under pressure is that they start lashing out at the competition. When they sink really low, they do it by name. These less chic practices have now also tempted AWS, which we did not expect. AWS does it indirectly, by outlining situations in which everyone knows which competitor is meant. For example, something was said about a flood and a fire in a data center that kept a Google Cloud availability zone in Paris offline for weeks. It is not as if AWS never suffers outages, by the way. It was also mentioned that Amazon has already presented its fourth-generation ARM Graviton processor while the competition has yet to start; Microsoft Azure presented its first ARM chips only a few weeks earlier.
AWS is free to talk about competitors this way, of course. However, CTO Werner Vogels told Techzine a few years ago that he found it childish when Oracle did the same thing to AWS. Vogels said companies should focus on their own innovations and, above all, listen to feedback from their own customers. Measured against that standard, this is not chic, to say the least. Ultimately, it is primarily a sign of weakness: if you have to take down the competition, it means you have too few (re)inventions of your own. It also confirms the pressure AWS is putting on itself. The company wants to remain the market leader at all costs. In many European countries, AWS has already lost that position to Microsoft, and we don't see that changing anytime soon, especially with the rise of generative AI.
AWS can’t handle generative AI competition
Re:Invent 2023 was the moment for AWS to show how well it can innovate, to show that it is the leader in AI. Unfortunately, that is not the case. Google and Microsoft have received the rise of generative AI as a godsend. They took immediate action and are applying the technology broadly in their solutions, including toward end users. Google Cloud and Google Workspace are getting Duet AI, for example. Microsoft is betting on Copilot for Microsoft 365, Dynamics and GitHub. In search engines, Bing Chat and Google Bard have appeared. AWS is the big loser here. The company cannot bring the technology to a large audience as quickly and is therefore slower to gain experience and expertise. Amazon simply does not have such solutions.
Amazon Q must compete with Duet AI, Microsoft Copilot and ChatGPT
Amazon Q must become a competitor to those well-known generative AI solutions. However, where you can simply activate Duet AI and Copilot with a checkbox within the SaaS solution, you have to roll out Amazon Q in your organization yourself. For starters, you have to connect all the company data to Q yourself, which immediately creates a huge hurdle.
Connecting all kinds of data sources to a solution like Q doesn't need to be very complex in itself; with other generative AI solutions, you can also extend the LLMs with your own data. However, you must also consider compliance and governance, and that is what makes it much more complex. Simply linking an ERP, HCM, finance or CRM system to such a solution can make data far too widely available within an organization. In that respect, an overarching generative AI bot sounds nice mostly in theory. In practice, it is probably much safer to rely on the generative AI solutions that the large ERP and CRM players will develop within their own products. Those are also likely to be superior in capabilities, because these vendors know their own software much better and can also perform actions in it.
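To make the governance point above concrete, here is a minimal, purely illustrative sketch of why "just link everything to the bot" is risky: a retrieval step that grounds answers in company documents must filter sources by the caller's role, or an HR-only fact leaks to anyone who asks. All names and documents here are hypothetical; real assistants such as Amazon Q or Duet AI implement this internally and far more elaborately.

```python
# Hypothetical sketch: role-filtered retrieval over company documents.
# Shows why governance, not connectivity, is the hard part of an
# organization-wide generative AI assistant.
from dataclasses import dataclass

@dataclass
class Document:
    text: str
    allowed_roles: set  # governance metadata: who may see this source

CORPUS = [
    Document("Q3 revenue grew 12 percent year over year.", {"finance", "executive"}),
    Document("The VPN endpoint moves to a new address next month.", {"it", "executive"}),
    Document("Salary bands are reviewed every January.", {"hr"}),
]

def retrieve(question: str, role: str, corpus=CORPUS):
    """Return document texts relevant to the question, filtered by role."""
    words = set(question.lower().split())
    hits = []
    for doc in corpus:
        if role not in doc.allowed_roles:
            continue  # compliance: never surface sources the user may not read
        overlap = len(words & set(doc.text.lower().split()))
        if overlap:
            hits.append((overlap, doc.text))
    hits.sort(reverse=True)  # most word overlap first
    return [text for _, text in hits]

# An IT user asking about salaries gets nothing back; an HR user does.
print(retrieve("when are salary bands reviewed", "it"))  # → []
print(retrieve("when are salary bands reviewed", "hr"))
```

Without the role check in the loop, every connected system's data would be answerable to every employee, which is exactly the over-exposure described above.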
Amazon Bedrock is already 127 – 4 behind
With machine learning and Amazon SageMaker, AWS has a strong product, but with generative AI it has not yet succeeded. Whereas AWS normally offers a great deal of choice, as described earlier, perhaps too much, it lags far behind in generative AI. Amazon Bedrock offers 20 LLMs in U.S. regions but only 4 in Europe. Google's Vertex AI currently offers 127 LLMs, many of which are already specialized and offer far more capabilities. At Google, the data can be stored in 10 countries, including many European ones: the Netherlands, France, Germany, Belgium and the United Kingdom. So in Europe, Bedrock is effectively 127 – 4 behind.
AWS needs to re:Invent itself
If AWS wants to remain the market leader, it must reposition itself. It can't keep focusing only on the technical persona. It must also develop a more straightforward business story that better aligns with its portfolio. If AWS doesn't, the pressure will intensify and the competition will overtake it.
We don't expect AWS to do things substantially differently; the company and the technical people who work there ultimately thrive best in such a technical community, and that way AWS can keep producing hundreds of announcements every year around re:Invent. It does mean, however, that management will have to come to terms with Microsoft eventually becoming the bigger cloud, and perhaps Google as well. AWS won't be the biggest cloud player, but it will remain the one with the biggest portfolio and the most freedom for developers. Hopefully, it will also behave a little more chic again. Next year there will be a new AWS re:Invent.