With over 100 million monthly active users as of December, ChatGPT, from San Francisco startup OpenAI, is a massive success. Its popularity has generated significant media coverage, countless social media memes, hundreds of e-books in Amazon's Kindle store, and a co-author credit on at least one scientific paper. As a business, however, OpenAI must monetize ChatGPT to satisfy its investors. The February debut of ChatGPT Plus, a premium service, was a step in that direction. The ChatGPT API goes further, letting companies integrate ChatGPT technology into their own applications, websites, products, and services.

According to Greg Brockman, OpenAI's president, chairman, and co-founder, an API was always part of the plan. Before the launch, he said in a video call that the company needed time to bring the APIs up to a certain quality standard, and to make sure they could handle the scale and the demand.

Enhancements

Brockman explained that the ChatGPT API runs on the same AI model that powers OpenAI's hugely successful ChatGPT, called "gpt-3.5-turbo." GPT-3.5 is currently the most capable text-generating model OpenAI offers through its API suite; the "turbo" suffix denotes a more optimized, responsive version of GPT-3.5 that OpenAI has been quietly testing for ChatGPT.

ChatGPT API

Brockman says early adopters such as Snap, Quizlet, Instacart, and Shopify are already using the API, which is priced at $0.002 per 1,000 tokens (roughly 750 words) and can power a wide range of experiences, including "non-chat" applications. The primary motivation behind gpt-3.5-turbo may well have been reducing ChatGPT's steep computing costs: OpenAI CEO Sam Altman has called the expenditures "eye-watering," estimating them at a few cents per chat in compute.
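To make the quoted pricing concrete, here is a minimal sketch of what $0.002 per 1,000 tokens works out to; the helper name and the 10,000-token example figure are illustrative assumptions, not from OpenAI.

```python
def estimate_cost_usd(tokens: int, price_per_1k_tokens: float = 0.002) -> float:
    """Estimate ChatGPT API cost at the quoted $0.002 per 1,000 tokens."""
    return tokens / 1000 * price_per_1k_tokens

# 1,000 tokens is roughly 750 words, so a ~7,500-word exchange
# (about 10,000 tokens) would cost around two cents.
print(round(estimate_cost_usd(10_000), 4))  # → 0.02
```

At that rate, even heavy usage stays cheap per request, which helps explain why the "eye-watering" costs Altman describes only add up at ChatGPT's enormous scale.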
(Given the enormous number of users, these costs presumably accumulate rapidly.)

According to Brockman, gpt-3.5-turbo has also been improved in other ways. He gives the example of an AI-powered tutor: the tutor should guide the student toward an answer with explanations rather than simply handing the answer over. Systems like this, which developers can now build on top of the API, make it more usable and accessible.

Applications

The ChatGPT API already underpins several products, including My AI, a chatbot for Snapchat+ subscribers recently introduced by Snap, and Quizlet's Q-Chat virtual tutor. Shopify built a customized shopping assistant on the API, while Instacart's upcoming tool, Ask Instacart, will let customers ask questions about food and receive "shoppable" answers informed by product data from the company's retail partners.

Zhuang, Instacart's chief architect, believes grocery shopping can be mentally taxing, with shoppers juggling budget, health, preferences, and more. AI, he suggests, could shoulder some of that mental load, helping heads of households with grocery shopping, meal planning, and cooking, and perhaps even making grocery shopping fun. By integrating OpenAI's ChatGPT with Instacart's own AI systems, the company hopes to explore what is possible within the Instacart app.

For those who have followed the ChatGPT story closely, it is fair to ask whether it is ready for release. In its early days, ChatGPT's responses could be racist and sexist, reflecting biases in its training data, which spans a wide range of internet content, including e-books, Reddit posts, and Wikipedia articles. ChatGPT also invents facts without disclosing that it is doing so, a phenomenon in AI known as hallucination.
ChatGPT is also vulnerable to prompt-based attacks that lead it to perform unintended tasks, and Reddit communities have sprung up dedicated to bypassing OpenAI's safeguards. In one incident, a Scale AI employee got ChatGPT to reveal details of its internal workings.

Challenges

Brands would rather not be caught up in such controversy, but Brockman insists the ChatGPT API will spare them. One reason is ongoing improvement to the backend, which has, in some cases, come at the expense of Kenyan contract workers. Another, far less controversial approach is Chat Markup Language (ChatML). Whereas the standard ChatGPT consumes raw text as a series of tokens, ChatML feeds text to the ChatGPT API as a sequence of messages together with metadata. By including instructions in the prompt, such as "You are a bot," developers can tailor and filter ChatGPT's responses to better fit their needs.

According to Brockman, OpenAI is moving toward a higher-level API: with ChatML, developers can present input to the system in a structured way, marking whether it comes from the end user or the developer. This approach makes the system more robust against prompt attacks.

To curb unintended ChatGPT behavior, model updates will also ship more frequently. With the launch of gpt-3.5-turbo, Brockman announced, OpenAI will automatically upgrade developers to the latest stable model, gpt-3.5-turbo-0301. Developers can opt to stay on an older model, though doing so may forgo the benefits of the upgrades.

Brockman notes that some customers, particularly large companies with sizable budgets, will get deeper control over system performance regardless of whether they move to the newest model, thanks to OpenAI's newly launched dedicated capacity plans.
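The role-tagged message structure described above can be sketched as follows. The helper function is hypothetical, but the role/content message shape matches what the ChatGPT API accepted at launch.

```python
from typing import Dict, List

def chatml_messages(system_prompt: str, user_input: str) -> List[Dict[str, str]]:
    """Wrap developer instructions and end-user text as separate,
    role-tagged messages so the model can tell them apart."""
    return [
        {"role": "system", "content": system_prompt},  # developer-supplied instruction
        {"role": "user", "content": user_input},       # untrusted end-user text
    ]

messages = chatml_messages(
    "You are a bot that tutors students without giving away the answer.",
    "What is the derivative of x**2?",
)
```

A list like this would be passed as the `messages` parameter to the chat completions endpoint, alongside `model="gpt-3.5-turbo"` (or the pinned `gpt-3.5-turbo-0301` for developers who opt out of automatic upgrades). Separating roles this way is what makes the structured input harder to hijack with prompt attacks than a single blob of raw text.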
Under the dedicated capacity plans, customers purchase an allocation of compute infrastructure to run an OpenAI model such as gpt-3.5-turbo, giving them full control over the instance's load; ordinarily, that infrastructure is shared among other users. Dedicated capacity also unlocks features such as longer context limits. A context limit defines how much text the model considers before generating new text, and raising it improves the model's ability to recall earlier text. While larger context limits are not a complete answer to bias and toxicity, they could reduce hallucination in models such as gpt-3.5-turbo. It is worth noting that Azure powers OpenAI's backend.

Conclusion

Brockman said that dedicated-capacity customers can expect gpt-3.5-turbo models with up to a 16k context window, allowing them to process four times as many tokens as the standard ChatGPT API model. That could enable tasks such as pasting in pages of tax code and getting reasonable answers back from the model. Brockman said OpenAI has no immediate plans to release the feature generally because of performance tradeoffs, though the company may eventually offer an on-demand version, which is unsurprising given the pressure to turn a profit after Microsoft's sizable investment.

Further Reading

- ChatGPT Stock to Make You Wealthier: Choose Among 7 Best Stocks
- Bank Crisis Is Easing as Late March FHLB Debt Issuance Declines
- Why are Wall Street Banks Cracking Down on ChatGPT?