Go to the project and launch the studio
Click on Prompt flow
Create a new standard flow
Enter a name and create it
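If you prefer to scaffold the flow locally instead of in the studio UI, the prompt flow CLI offers a rough equivalent; this is a minimal sketch, assuming the promptflow package is installed and my_standard_flow is just a placeholder folder name.

pip install promptflow promptflow-tools
pf flow init --flow ./my_standard_flow --type standard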
Delete the prebuilt components - echo and joke
Add an LLM component and name it GPT_Call
Enter the details for the LLM node (connection, deployment name, and API settings)
Modify the system prompt
Change the input variable name from topic to user_input and start the compute session
Once the compute session has started, click Validate user input
Set the GPT_Call input (map it to the flow input user_input)
Change the flow output so it references the GPT_Call node's output
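For reference, the GPT_Call node's prompt template (a Jinja2 file in prompt flow) might look roughly like the sketch below once the system prompt is modified and the input is wired to user_input; the wording of the system message is only an illustrative placeholder.

system:
You are a helpful assistant. Answer the user's question clearly and concisely.

user:
{{user_input}}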
Run the flow
Click on View output
Check the output
Save the flow
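If you have downloaded the flow folder, it can also be run locally with the prompt flow CLI as a quick sanity check; the flow path and sample question below are placeholders.

pf flow test --flow ./my_standard_flow --inputs user_input="What can prompt flow do?"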
Deployment of Prompt Flow
Click on Deploy
Change the endpoint name and click Next
Keep the configuration shown below and click Next
If you get any error, follow the steps below and register the required resource providers in your subscription.
Steps -
Go to Azure Portal
Select your Subscription
Select Settings
Select Resource Providers
Make sure the Microsoft.PolicyInsights and Microsoft.Cdn resource providers are registered by selecting them and clicking the Register button (a CLI equivalent is sketched after these steps).
Refresh and retry the deployment from ML studio after some time.
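The same registrations can also be done from the Azure CLI; this is a minimal sketch, assuming the Azure CLI is installed and you are logged in to the right subscription.

az account set --subscription "<your-subscription-id>"
az provider register --namespace Microsoft.PolicyInsights
az provider register --namespace Microsoft.Cdn
az provider show --namespace Microsoft.PolicyInsights --query registrationState -o tsv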
Now check Models + endpoints for the new chat endpoint
Open the chat endpoint and test it
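The endpoint can also be tested from the Azure CLI with the ml extension; the endpoint, resource group, and workspace names below are placeholders, and the JSON field must match the flow input name (user_input here).

echo '{"user_input": "Hello"}' > sample_request.json
az ml online-endpoint invoke --name <endpoint-name> --request-file sample_request.json --resource-group <resource-group> --workspace-name <workspace-name>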
Go to Consume and note down the endpoint URL and the key
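If you prefer the CLI, the scoring URI and keys can also be fetched with the ml extension; names in angle brackets are placeholders.

az ml online-endpoint show --name <endpoint-name> --resource-group <resource-group> --workspace-name <workspace-name> --query scoring_uri -o tsv
az ml online-endpoint get-credentials --name <endpoint-name> --resource-group <resource-group> --workspace-name <workspace-name>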
Test from the command prompt using the command below, replacing the placeholders with the endpoint URL and key you noted; the JSON field name must match the flow input (user_input in this walkthrough)
curl -X POST "<your-endpoint-url>" -H "Content-Type: application/json" -H "Authorization: Bearer <your-api-key>" -d '{"user_input": "What is prompt flow?"}'
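For repeated testing, a small shell snippet that keeps the endpoint and key in variables is convenient; this is a sketch with placeholder values, and jq is only used to pretty-print the response.

ENDPOINT_URL="<your-endpoint-url>"
API_KEY="<your-api-key>"
curl -s -X POST "$ENDPOINT_URL" -H "Content-Type: application/json" -H "Authorization: Bearer $API_KEY" -d '{"user_input": "Give me a one-line summary of prompt flow"}' | jq .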