This solution provides an example of how to process your own documents and then use Microsoft Foundry and the Microsoft Agent Framework to ask questions specific to those documents.
NOTE: The console app is also provided to demonstrate how to use AI Foundry and the Agent Framework to ask questions of an AI agent.
This solution consists of:
- A console app to easily run and test locally with the following commands:
  - `process` - process a file through Document Intelligence, then create embeddings and add the file to Azure AI Search
  - `doc` - set the active document you want to ask questions about
  - `ask` - ask questions about the active document
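A hypothetical session with the console app might look like the following. This is illustrative only: the file name, prompt text, and output shown here are assumptions, not verbatim app behavior.

```text
> process ./samples/contract.pdf      (hypothetical path; runs the file through Document Intelligence and indexing)
> doc contract.pdf                    (make this the active document)
> ask What is the termination clause in this contract?
  ...the agent answers using content indexed from contract.pdf...
```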
- The deployment script can create a new Azure OpenAI Service for you; however, if you want to reuse an existing one, it must be in the same subscription where you are going to deploy your solution, and you will need to retrieve its `Endpoint` and a `Key`.
- The PowerShell deployment script defaults to the `gpt-5-mini` and `text-embedding-3-large` models, each with a deployment name matching the model name. If you have something different in your Azure OpenAI instance, you will want to pass those values in on the PowerShell command line. Be aware that using a different GPT model may result in max-token violations with the example below.
Deployment is automated using PowerShell, the Azure CLI and the Azure Developer CLI. These can be easily installed on a Windows machine using winget:
```shell
winget install --id "Microsoft.AzureCLI" --silent --accept-package-agreements --accept-source-agreements
winget install --id "Microsoft.Azd" --silent --accept-package-agreements --accept-source-agreements
```

NOTE: Since you will be deploying a new Azure OpenAI instance, be aware there are location limitations based on the model. Please set your location value accordingly: Region Availability
Also, depending on your available Azure OpenAI model quota, you may get a capacity-related deployment error. If you do, you will need to modify the capacity value for the appropriate model in the `infra/azureopenai.bicep` file.
```shell
# Log in to the Azure Developer CLI
azd auth login

# If you have access to multiple tenants, you may want to specify the tenant id
azd auth login --tenant-id "<tenant guid>"

# Provision the resources
azd up

# Follow the prompts for the parameter values...
```

If successful, this process will create:
- Storage account with two blob containers (`raw` for uploaded documents and `extracted` for processed output)
- A Microsoft Foundry resource and project, with `gpt-5-mini` and `text-embedding-3-large` deployments and a system-assigned managed identity
- Role assignment for the Cognitive Services identity granting read access to the `raw` container and write access to the `extracted` container
- Azure Cognitive Search account
- Azure Document Intelligence account
- Azure Application Insights resource automatically connected to the Microsoft Foundry project for telemetry and monitoring
Along with the Azure deployment, the `azd` command will configure the `local.settings.json` file for the console app and the local function. To run the console app:
```shell
dotnet run --project ./DocumentQuestionsConsole/DocumentQuestionsConsole.csproj
```

- If this is your first time running the app or the Functions, you will not have any documents processed, and you will be prompted to upload one with the `process` command
- Upload a document using the `process` command
- Set the current document using the `doc` command
- Start asking questions!

Try uploading your own documents and start asking questions!
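For reference, the `local.settings.json` file that `azd` populates generally follows the Azure Functions local settings shape (`IsEncrypted` plus a `Values` map). The sketch below is an assumption for illustration only: every key name and placeholder value inside `Values` is hypothetical, so consult the file `azd` actually generates for the real names.

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureOpenAiEndpoint": "https://<your-openai-resource>.openai.azure.com/",
    "AzureOpenAiKey": "<your-key>",
    "OpenAiChatDeployment": "gpt-5-mini",
    "OpenAiEmbeddingDeployment": "text-embedding-3-large",
    "AiSearchEndpoint": "https://<your-search>.search.windows.net",
    "DocumentIntelligenceEndpoint": "https://<your-docintel>.cognitiveservices.azure.com/"
  }
}
```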







