
Use the latest OpenAI models in your next Gadget build


Build your AI apps in Gadget using the new GPT-4 Turbo, Dall-E 3, the Assistants API, and GPT-4 Turbo with Vision


Emma Hyde
November 28, 2023

TL;DR: You can build with the latest OpenAI models in your Gadget apps.

OpenAI recently announced dozens of new additions and improvements to their platform. We’re most excited about the updated world knowledge, but they’ve also promised longer context windows, more control and reliability, and better pricing. Developers building AI apps with Gadget have access to the new models, including <inline-code>gpt-4-1106-preview</inline-code>, <inline-code>gpt-4-vision-preview</inline-code>, and <inline-code>dall-e-3</inline-code>, as well as the new Assistants API.

To help you get started building AI apps using Gadget, you get a $50 OpenAI credit through the Gadget-provided OpenAI credentials, which you can use to try out all the new models. Once you are out of credit, you will need to supply your own OpenAI API key on a paid OpenAI account to keep using these models.

What’s new in Gadget?

Support for GPT-4 Turbo

For those of you eager to get your hands on the latest GPT-4 Turbo, we’ve got you covered. The <inline-code>gpt-4-1106-preview</inline-code> model is the next improvement on GPT-4, giving you a more capable, more knowledgeable, and more affordable (per token) way to use LLMs.
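As a rough sketch, here is what a Chat Completions request against this model looks like with the <inline-code>openai</inline-code> Node.js client. The messages below are illustrative, and the actual network call is shown as a comment:

```javascript
// Illustrative Chat Completions payload for GPT-4 Turbo.
// In a Gadget action, you would pass it to a configured client, e.g.:
//   const completion = await openai.chat.completions.create(request);
const request = {
  model: "gpt-4-1106-preview",
  messages: [
    { role: "system", content: "You are a helpful assistant for a Shopify store." },
    { role: "user", content: "Suggest three taglines for a handmade candle shop." },
  ],
  max_tokens: 256,
};
```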

Support for GPT-4 Turbo with Vision

The main differentiator between GPT-4 Turbo and the <inline-code>gpt-4-vision-preview</inline-code> model is the latter’s ability to process and understand image contents, including reading documents that contain figures. For Shopify app developers, this could be a powerful tool for working with product images, or reports about your store.
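As a sketch of the request shape: with the vision model, a user message’s <inline-code>content</inline-code> can be an array that mixes text parts and image URLs. The product photo URL below is a placeholder:

```javascript
// Illustrative payload: ask the vision model to describe a product image.
// The request would be sent via openai.chat.completions.create(request).
const request = {
  model: "gpt-4-vision-preview",
  messages: [
    {
      role: "user",
      content: [
        { type: "text", text: "Write a short product description for this image." },
        { type: "image_url", image_url: { url: "https://example.com/product.jpg" } },
      ],
    },
  ],
  max_tokens: 300,
};
```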

Support for Dall-E 3

If you’re looking to build apps that will generate images and designs, you’ll be happy to hear that we’ve also added support for Dall-E 3. Just like the previous version, <inline-code>dall-e-3</inline-code> comes with built-in moderation tools to ensure the apps you build are safe and compliant.
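As a sketch, an image-generation request against this model goes through the Images API. The prompt and size below are illustrative:

```javascript
// Illustrative Images API payload for dall-e-3.
// The call itself would be: const image = await openai.images.generate(request);
const request = {
  model: "dall-e-3",
  prompt: "A minimalist logo for a handmade candle shop",
  n: 1,
  size: "1024x1024",
};
```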

Support for the Assistants API

You can now build AI Assistants within your Gadget apps. OpenAI introduced a new API that allows you to use the GPT models to create new flows and automations within your own applications. For example, you could use it to allow merchants to ask, using natural language, to "generate a report at 3pm on Saturday". The Assistant could run a Shopify sync on one or more models at a specific time, and produce a report for the synced data.

If you’re looking to build with the new Assistants API, note that any Assistants you create are tied to your OpenAI organization. Make sure to remove the Gadget-provided credentials and replace them with your own before you start building.
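The report-scheduling flow described above can be sketched roughly as follows using the <inline-code>openai</inline-code> Node.js SDK’s beta Assistants endpoints. The assistant name and instructions are made up for illustration, and the network calls are shown as comments:

```javascript
// Illustrative configuration for an assistant that handles scheduling requests.
const assistantConfig = {
  model: "gpt-4-1106-preview",
  name: "Report Scheduler",
  instructions:
    "Turn merchants' natural-language requests into scheduled Shopify syncs and reports.",
};

// Typical Assistants API flow: create the assistant once, then for each
// conversation create a thread, add messages, and start a run.
// const assistant = await openai.beta.assistants.create(assistantConfig);
// const thread = await openai.beta.threads.create();
// await openai.beta.threads.messages.create(thread.id, {
//   role: "user",
//   content: "generate a report at 3pm on Saturday",
// });
// const run = await openai.beta.threads.runs.create(thread.id, {
//   assistant_id: assistant.id,
// });
```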

If you’re new to Gadget and want to see these models in action, try our quickstart guide!

So what has OpenAI changed?

Better context length

Previously, GPT-4 supported context lengths up to 8k, and sometimes up to 32k. But the community consistently shared that it wasn’t enough. With GPT-4 Turbo, OpenAI is giving users up to 128k tokens of context. According to Sam Altman, that equates to approximately 300 pages of a book. But on top of more context, they also promise that the output will be even more accurate than before.

What is context length?
A model's context length is the maximum number of tokens that can be used for input and output. For example, if a model has a context length of 1024, then the sum of your prompt and output response can be a maximum of 1024 tokens long. Tokens are not the same as words, and the number of tokens in a sentence can vary depending on the model. Learn more about context length 

More control and reliability

Ever since the GPT models became available, developers have been asking for more control over responses and outputs. With the new models, we’re getting an all-new “JSON mode” which ensures the model will respond with valid JSON. The hope is that calling APIs and functions will be much easier now. You can see the <inline-code>response_format</inline-code> parameter on the Chat Completions API for an example.
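As a sketch, enabling JSON mode is a one-line addition to a Chat Completions request. Note that the API also requires the word “JSON” to appear somewhere in your messages:

```javascript
// Illustrative request with JSON mode enabled via response_format.
const request = {
  model: "gpt-4-1106-preview",
  response_format: { type: "json_object" },
  messages: [
    { role: "system", content: "Reply with a JSON object containing a 'summary' field." },
    { role: "user", content: "Summarize today's orders." },
  ],
};
// const completion = await openai.chat.completions.create(request);
// The returned content should then parse cleanly with JSON.parse().
```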

Better world knowledge

Up until now, one of the biggest challenges for many developers was having to provide additional context (using up a lot of tokens) to help OpenAI generate an accurate answer, because the models were only trained on information up until September 2021. With the recent changes, OpenAI has updated its <inline-code>gpt-4-1106-preview</inline-code> and <inline-code>gpt-4-vision-preview</inline-code> models to have knowledge all the way up until April 2023. And this time, they promise they won’t let it get that out of date again.

See it in action

The Gadget founders sat down to play around with the new models and build an app that generates product descriptions from only an image, using the latest OpenAI models and Gadget.

Keep reading to learn about how it's built

Under the hood
