The second one.
It was Sam Altman himself who claimed that 2024 would be "THE" year for AI technology as a whole. It's only March, and if we ended the year right now, that statement would already hold true. If we count the general adoption of OpenAI's GPT-4 Turbo (released in November 2023 but with a sharp rise in usage in January) as part of 2024, then we have received major upgrades to OpenAI's GPT-4, Google's Gemini, Anthropic's Claude, xAI's Grok, and more. In previous years, any ONE of these announcements would have been industry-altering; instead, we've had four within four months. It's been fascinating to watch how the world, in particular government bodies and the public, is adjusting.
In early March, India announced that an advisory board would sit over the country's tech ecosystem and determine whether new LLMs could be released to the public. Then, after receiving massive backlash (some of it from major VC partners), the country withdrew the requirement. On March 13, the EU announced the approval of the AI Act (by an overwhelming majority: 523 votes in favor, 46 against, and 49 abstentions), which sets rules around how AI can be used and what counts as legal versus illegal development and deployment. The US has held many governing conversations about how to structure or oversee AI, but no major laws have been passed and no binding rules have been set.
Those in favor of regulation argue that AI can be dangerous when it is misused, misleading, or inaccurate - for example, using AI-generated video to produce lifelike footage of politicians in a malicious way, or using AI voice generation to impersonate people on the phone. Those against regulation counter that any regulation will stifle growth and innovation in an extremely competitive landscape, where being a few weeks ahead of the competition can lead to massive differences in market share. This is not too different from the battle over Cryptocurrency, which took several years but is now part of the common lexicon and the purchasing portfolios of retail and professional investors alike.

I can personally understand both sides of this argument, but I think the general public wants to see how AI will tangibly be used before choosing a side. AI differs from Cryptocurrency because (in general) we have yet to see monetary gains or losses in our own wallets. The impact has mainly been felt by company executives who have reduced headcount, or by tech employees who use efficiency tooling to lighten their workload. Certainly, a large number of tech companies are popping up with AI at the core of their SaaS, but I think it's safe to say that for the majority of the population outside of SF, NYC, SEA, and LA - the impact AI will have on their lives hasn't really hit home. For this large group, ChatGPT is fun to play with and helps craft better emails, but they have yet to see it change their lives. The buzz is undeniable (view graph below), but the monetary impacts don't quite add up…yet.
To continue the Cryptocurrency comparison, my belief is that it will take a Bitcoin-like moment for the greater population of America to realize how AI will impact them and then take a stance on regulation (either for or against). Hopefully, the event(s) that cause this will be positive, as with Bitcoin (i.e., an AI service that saves millions of people time or money), but it could be negative (i.e., an AI bot that scams millions of people out of time or money, or takes their jobs). I think Sam is right that this moment is fast approaching and likely to happen in 2024.
As a reminder, this is why Coalesce is here: to help companies figure out how AI can impact their business. The truth is, AI can be extremely impactful when implemented in the right use cases, and those use cases are specific to your company and your goals. That is why the partnership model works. Everyone will have an AI strategy in the next few years; those who act first will likely outpace those who act later.