Welcome back to this series, where we're learning how to integrate AI products into web applications.
- Intro & Setup
- Your First AI Prompt
- Streaming Responses
- How Does AI Work
- Prompt Engineering
- AI-Generated Images
- Security & Reliability
- Deploying
Last time, we got all of the boilerplate out of the way.
In this post, we'll learn how to integrate OpenAI's API responses into our Qwik app using fetch. We'll want to make sure we're not leaking API keys by executing those HTTP requests from a backend.
By the end of this post, we will have a rudimentary but working AI application.
Generate an OpenAI API Key
Before we start building anything, you'll want to go to platform.openai.com/account/api-keys and generate an API key to use in your application.

Make sure to keep a copy of it somewhere, because you will only be able to see it once.
With your API key, you'll be able to make authenticated HTTP requests to OpenAI. So it's a good idea to get familiar with the API itself. I'd encourage you to take a brief look through the OpenAI documentation and become familiar with some concepts. The models are particularly good to understand because they have varying capabilities.
If you'd like to familiarize yourself with the API endpoints, expected payloads, and return values, check out the OpenAI API Reference. It also contains helpful examples.
You may notice the JavaScript package available on NPM called openai. We will not be using it, as it doesn't quite support some things we'll want to do that fetch can.
Make Your First HTTP Request
The application we're going to build will make an AI-generated text completion based on the user input. For that, we'll need to work with the chat endpoint (note that the completions endpoint is deprecated).
We want to make a POST request to https://api.openai.com/v1/chat/completions with the 'Content-Type' header set to 'application/json', the 'Authorization' header set to 'Bearer OPENAI_API_KEY' (you'll want to replace OPENAI_API_KEY with your own API key), and the body set to a JSON string containing the GPT model to use (we'll use gpt-3.5-turbo) and an array of messages:
fetch('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer OPENAI_API_KEY'
  },
  body: JSON.stringify({
    'model': 'gpt-3.5-turbo',
    'messages': [
      {
        'role': 'user',
        'content': 'Tell me a funny joke'
      }
    ]
  })
})
You can run this right from your browser console and see the request in the Network tab of your dev tools.
The response should be a JSON object with a bunch of properties, but the one we're most interested in is "choices". It will be an array of text completion objects. The first one should be an object with a "message" object that has a "content" property with the chat completion.
{
  "id": "chatcmpl-7q63Hd9pCPxY3H4pW67f1BPSmJs2u",
  "object": "chat.completion",
  "created": 1692650675,
  "model": "gpt-3.5-turbo-0613",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Why don't scientists trust atoms?\n\nBecause they make up everything!"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 13,
    "total_tokens": 25
  }
}
Congrats! Now you can request a mediocre joke whenever you want.
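Once the response arrives, pulling the joke out is just a matter of drilling into that structure. Here's a small hypothetical helper (the function name is my own, not part of the API) run against the example response above:

```javascript
// Hypothetical helper: safely pull the completion text out of an
// OpenAI chat response body (returns null if the shape is unexpected).
function getCompletionText(responseBody) {
  return responseBody?.choices?.[0]?.message?.content ?? null;
}

// Using the example response shown above:
const example = {
  choices: [
    {
      index: 0,
      message: {
        role: 'assistant',
        content: "Why don't scientists trust atoms?\n\nBecause they make up everything!"
      },
      finish_reason: 'stop'
    }
  ]
};

console.log(getCompletionText(example)); // logs the joke text
```

The optional chaining is just defensive sugar; the happy path is the same choices[0].message.content access we'll use for the rest of the post.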
Build the Form
The fetch request above is fine, but it's not quite an application. What we want is something a user can interact with to generate an HTTP request like the one above.
For that, we'll probably want a form, starting with an HTML <form> containing a <textarea>. Below is the minimum markup we need, and if you want to learn more, consider reading these articles:
<form>
  <label for="prompt">Prompt</label>
  <textarea id="prompt" name="prompt"></textarea>
  <button>Tell me</button>
</form>
We can copy and paste this form right into our Qwik component's JSX template. If you've worked with JSX in the past, you may be used to replacing the for attribute on the <label> with htmlFor, but Qwik's compiler actually doesn't require us to do that, so it's fine as is.
Next, we'll want to replace the default form submission behavior. By default, when an HTML form is submitted, the browser will create an HTTP request by loading the URL provided in the form's action attribute. If none is provided, it will use the current URL. We want to avoid this page load and use JavaScript instead.
If you've done this before, you may be familiar with the preventDefault method on the Event interface. As the name suggests, it prevents the default behavior for the event.
There's a challenge here due to how Qwik deals with event handlers. Unlike other frameworks, Qwik does not download all of the JavaScript logic for the application upon first page load. Instead, it has a very thin client that intercepts user interactions and downloads the JavaScript event handlers on demand.
This asynchronous nature makes Qwik applications much faster to load, but introduces the challenge of dealing with event handlers asynchronously. It makes it impossible to prevent the default behavior the same way as synchronous event handlers that are downloaded and parsed before the user interactions.
Fortunately, Qwik provides a way to prevent the default behavior by adding preventdefault:{eventName} to the HTML tag. A very basic form example might look something like this:
import { component$ } from '@builder.io/qwik';

export default component$(() => {
  return (
    <form
      preventdefault:submit
      onSubmit$={(event) => {
        console.log(event)
      }}
    >
      {/* form contents */}
    </form>
  )
})
Did you notice that little $ at the end of the onSubmit$ handler there? Keep an eye out for those, because they're usually a hint to the developer that Qwik's compiler is going to do something funny and transform the code. In this case, it's due to that lazy-loading event handling system I mentioned above. If you plan on working with Qwik more, it's worth reading more about that here.
Incorporate the Fetch Request
Now we have the tools in place to replace the default form submission with the fetch request we created above.
What we want to do next is pull the data from the <textarea> into the body of the fetch request. We can do so with FormData, which expects a form element as an argument and provides an API to access a form control's value through the control's name attribute.
We can access the form element from the event's target property, use it to create a new FormData object, and use that to get the <textarea> value by referencing its name, "prompt". Plug that into the body of the fetch request we wrote above, and you might get something that looks like this:
export default component$(() => {
  return (
    <form
      preventdefault:submit
      onSubmit$={(event) => {
        const form = event.target
        const formData = new FormData(form)
        const prompt = formData.get('prompt')
        const body = {
          'model': 'gpt-3.5-turbo',
          'messages': [{ 'role': 'user', 'content': prompt }]
        }

        fetch('https://api.openai.com/v1/chat/completions', {
          method: 'POST',
          headers: {
            'Content-Type': 'application/json',
            'Authorization': 'Bearer OPENAI_API_KEY'
          },
          body: JSON.stringify(body)
        })
      }}
    >
      {/* form contents */}
    </form>
  )
})
In theory, you should now have a form on your page that, when submitted, sends the value from the textarea to the OpenAI API.
Protect Your API Keys
Although our HTTP request is working, there's a glaring issue. Because it's being constructed on the client side, anyone can open the browser dev tools and inspect the properties of the request. This includes the Authorization header containing our API keys.

This would allow someone to steal our API tokens and make requests on our behalf, which could lead to abuse or higher charges on our account.
Not good!!!
The best way to prevent this is to move the API call to a backend server that we control, which can work as a proxy. The frontend can make an unauthenticated request to the backend, and the backend makes the authenticated request to OpenAI and returns the response to the frontend. Because users can't inspect backend processes, they wouldn't be able to see the Authorization header.
So how do we move the fetch request to the backend?
I'm so glad you asked!
We've been mostly focusing on building the frontend with Qwik, the framework, but we also have access to Qwik City, the full-stack meta-framework with tooling for file-based routing, route middleware, HTTP endpoints, and more.
Of the various options Qwik City offers for running backend logic, my favorite is routeAction$. It allows us to create a backend function that can be triggered from the client over HTTP (essentially an RPC endpoint).
The logic would follow:
- Use routeAction$() to create an action.
- Provide the backend logic as the parameter.
- Programmatically execute the action's submit() method.
A simplified example could be:
import { component$ } from '@builder.io/qwik';
import { routeAction$ } from '@builder.io/qwik-city';

export const useAction = routeAction$((params) => {
  console.log('action on the server', params)
  return { o: 'k' }
})

export default component$(() => {
  const action = useAction()
  return (
    <>
      <form
        preventdefault:submit
        onSubmit$={(event) => {
          action.submit('data')
        }}
      >
        {/* form contents */}
      </form>
      {JSON.stringify(action)}
    </>
  )
})
I included a JSON.stringify(action) at the end of the template because I think you should see what the returned ActionStore looks like. It contains additional information like whether the action is running, what the submission values were, what the response status is, what the returned value is, and more.
This is all very useful data that we get out of the box just by using an action, and it allows us to create more robust applications with less work.
Improve the Experience
Qwik City actions are cool, but they get even better when combined with Qwik's <Form> component:
Under the hood, the component uses a native HTML <form> element, so it will work without JavaScript.
When JS is enabled, the component will intercept the form submission and trigger the action in SPA mode, allowing you to have a full SPA experience.
By replacing the HTML <form> element with Qwik's <Form> component, we no longer have to set up preventdefault:submit, onSubmit$, or call action.submit(). We can simply pass the action to the Form's action prop, and it will take care of the work for us. Additionally, it will work if JavaScript isn't available for some reason (we could have done this with the HTML version as well, but it would have been more work).
import { component$ } from '@builder.io/qwik';
import { routeAction$, Form } from '@builder.io/qwik-city';

export const useAction = routeAction$(() => {
  console.log('action on the server')
  return { o: 'k' }
});

export default component$(() => {
  const action = useAction()
  return (
    <Form action={action}>
      {/* form contents */}
    </Form>
  )
})
So that's an improvement for the developer experience. Let's also improve the user experience.
Within the ActionStore, we have access to the isRunning property, which keeps track of whether the request is pending or not. It's handy information we can use to let the user know when the request is in flight.
We can do so by modifying the text of the submit button to say "Tell me" when it's idle, then "One sec..." while it's loading. I also like to assign the aria-disabled attribute to match the isRunning state. This will hint to assistive technology that it's not ready to be clicked (though technically it still can be). It can also be targeted with CSS to provide visual styles suggesting it's not quite ready to be clicked again.
<button type="submit" aria-disabled={action.isRunning}>
  {action.isRunning ? 'One sec...' : 'Tell me'}
</button>
Show the Results
Okay, we've done way too much work without actually seeing the results on the page. It's time to change that. Let's bring the fetch request we prototyped earlier in the browser into our application.
We can copy/paste the fetch code right into the body of our action handler, but to access the user's input, we'll need access to the form data that was submitted. Fortunately, any data passed to the action.submit() method will be available to the action handler as the first parameter. It will be a serialized object where the keys correspond to the form control names.
Note that I'll be using the await keyword in the body of the handler, which means I also have to tag the handler as an async function.
import { component$ } from '@builder.io/qwik';
import { routeAction$, Form } from '@builder.io/qwik-city';

export const useAction = routeAction$(async (formData) => {
  const prompt = formData.prompt // From <textarea name="prompt">
  const body = {
    'model': 'gpt-3.5-turbo',
    'messages': [{ 'role': 'user', 'content': prompt }]
  }

  const response = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer OPENAI_API_KEY'
    },
    body: JSON.stringify(body)
  })
  const data = await response.json()

  return data.choices[0].message.content
})
At the end of the action handler, we also want to return some data for the frontend. The OpenAI response comes back as JSON, but I figure we might as well just return the text. If you remember from the response object we saw above, that data is located at responseBody.choices[0].message.content.
If we set things up correctly, we should be able to access the action handler's response in the ActionStore's value property. This means we can conditionally render it somewhere in the template like so:

{action.value && (
  <p>{action.value}</p>
)}
Use Environment Variables
Alright, we've moved the OpenAI request to the backend, protected our API keys from prying eyes, we're getting a (mediocre joke) response, and displaying it on the frontend. The app is working, but there's still one more security issue to deal with.
It's generally a bad idea to hard-code API keys into your source code, for a number of reasons:
- It means you can't share the repo publicly without exposing your keys.
- You could run up API usage during development, testing, and staging.
- Changing API keys requires code changes and re-deploys.
- You'll need to regenerate API keys anytime someone leaves the org.
A better system is to use environment variables. With environment variables, you can provide the API keys only to the systems and users that need access to them.
For example, you can make an environment variable called OPENAI_API_KEY with the value of your OpenAI key for only the production environment. This way, only developers with direct access to that environment would be able to access it. This greatly reduces the likelihood of the API keys leaking, it makes it easier to share your code openly, and because you're limiting access to the keys to the fewest number of people, you don't need to replace keys as often when someone leaves the company.
In Node.js, it's common to set environment variables from the command line (ENV_VAR=example npm start) or with the popular dotenv package. Then, in your server-side code, you can access environment variables using process.env.ENV_VAR.
Things work slightly differently with Qwik.
Qwik can target different JavaScript runtimes (not just Node), and accessing environment variables via process.env is a Node-specific concept. To make things more runtime-agnostic, Qwik provides access to environment variables through a RequestEvent object, which is available as the second parameter to the route action handler function.
import { routeAction$ } from '@builder.io/qwik-city';

export const useAction = routeAction$((param, requestEvent) => {
  const envVariableValue = requestEvent.env.get('ENV_VARIABLE_NAME')
  console.log(envVariableValue)
  return {}
})
So that's how we access environment variables, but how do we set them?
Unfortunately, for production environments, setting environment variables will vary depending on the platform. For a standard server VPS, you can still set them from the terminal as you would in Node (ENV_VAR=example npm start).
In development, however, we can create a local .env file containing the environment variables, and they will be automatically assigned for us. This is convenient since we spend a lot more time starting the development environment, and it means we can provide the appropriate API keys only to the people who need them.
So after you create a local .env file, you can assign the OPENAI_API_KEY variable to your API key.
OPENAI_API_KEY="your-api-key"
(You may need to restart your dev server.)
Then we can access the environment variable through the RequestEvent parameter. With that, we can replace the hard-coded value in our fetch request's Authorization header with the variable, using template literals.
export const usePromptAction = routeAction$(async (formData, requestEvent) => {
  const OPENAI_API_KEY = requestEvent.env.get('OPENAI_API_KEY')
  const prompt = formData.prompt
  const body = {
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: prompt }]
  }

  const response = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'post',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${OPENAI_API_KEY}`,
    },
    body: JSON.stringify(body)
  })
  const data = await response.json()

  return data.choices[0].message.content
})
For more details on environment variables in Qwik, see their documentation.
Recap
- When a user submits the form, the default behavior is intercepted by Qwik's optimizer, which lazy loads the event handler.
- The event handler uses JavaScript to create an HTTP request containing the form data to send to the server, to be handled by the route's action.
- The route's action handler has access to the form data in the first parameter and can access environment variables from the second parameter (a RequestEvent object).
- Within the route's action handler, we can construct and send the HTTP request to OpenAI using the data we got from the form and the API keys we pulled from the environment variables.
- With the OpenAI response, we can prepare the data to send back to the client.
- The client receives the response from the action and can update the page accordingly.
Here's what my final component looks like, including some Tailwind classes and a slightly different template.
import { component$ } from "@builder.io/qwik";
import { routeAction$, Form } from "@builder.io/qwik-city";

export const usePromptAction = routeAction$(async (formData, requestEvent) => {
  const OPENAI_API_KEY = requestEvent.env.get('OPENAI_API_KEY')
  const prompt = formData.prompt
  const body = {
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: prompt }]
  }

  const response = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'post',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${OPENAI_API_KEY}`,
    },
    body: JSON.stringify(body)
  })
  const data = await response.json()

  return data.choices[0].message.content
})

export default component$(() => {
  const action = usePromptAction()

  return (
    <main class="max-w-4xl mx-auto p-4">
      <h1 class="text-4xl">Hi 👋</h1>

      <Form action={action} class="grid gap-4">
        <div>
          <label for="prompt">Prompt</label>
          <textarea name="prompt" id="prompt">
            Tell me a joke
          </textarea>
        </div>

        <div>
          <button type="submit" aria-disabled={action.isRunning}>
            {action.isRunning ? 'One sec...' : 'Tell me'}
          </button>
        </div>
      </Form>

      {action.value && (
        <article class="mt-4 border border-2 rounded-lg p-4 bg-[canvas]">
          <p>{action.value}</p>
        </article>
      )}
    </main>
  );
});
Conclusion
All right! We've gone from a script that uses AI to get mediocre jokes to a full-blown application that securely makes HTTP requests to a backend that uses AI to get mediocre jokes and sends them back to the frontend to put those mediocre jokes on a page.
You should feel pretty good about yourself.
But not too good, because there's still room to improve.
In our application, we're sending a request and getting an AI response, but we're waiting for the entirety of the body of that response to be generated before showing it to the user. And these AI responses can take a while to complete.
If you've used AI chat tools in the past, you may be familiar with the experience where it looks like it's typing the responses to you, one word at a time, as they're being generated. This doesn't speed up the total request time, but it does get some information back to the user much sooner and feels like a faster experience.
In the next post, we'll learn how to build that same feature using HTTP streams, which are fascinating and powerful but can also be kind of confusing. So I'm going to dedicate an entire post just to that.
I hope you're enjoying this series and plan to stick around. In the meantime, have fun generating some mediocre jokes.
- Intro & Setup
- Your First AI Prompt
- Streaming Responses
- How Does AI Work
- Prompt Engineering
- AI-Generated Images
- Security & Reliability
- Deploying
Thank you so much for reading. If you liked this article, and want to support me, the best ways to do so are to share it, sign up for my newsletter, and follow me on Twitter.
Originally published on austingil.com.