AIPSTeam is a PowerShell-first AI collaboration script that simulates a small delivery team around your project idea or existing script. Instead of giving you one raw answer, it helps produce a more structured first draft: requirements framing, implementation guidance, PowerShell output, documentation, and review-style feedback. It works with Azure OpenAI, Ollama, or LM Studio and is best approached through one clear first-run path before exploring the wider feature set.
- Who this is for
- Why use AIPSTeam
- 5-minute quickstart
- What you get
- Example first-run output
- Representative example artifact
- Recommended parameters
- Supported environment
- Known limitations
- Advanced parameters
- Overview
- Retrieval-Augmented Generation (RAG)
- Installation notes
- Additional usage examples
- Developer notes
- Dependencies and prerequisites
- Troubleshooting
- FAQ
AIPSTeam is for:
- PowerShell builders who want more than a single one-shot prompt
- sysadmins and automation engineers who want AI-assisted project drafting
- people who want code, documentation, and review-style output in one workflow
AIPSTeam is not the best fit for:
- tiny one-line scripts
- users who do not have any LLM backend configured
- cases where fully production-ready output is expected from the first run without review
- Multi-role workflow: instead of one generic answer, the script simulates a team of specialists
- PowerShell-first output: focused on PowerShell project work rather than generic coding chatter
- Documentation and review included: useful when you want a first structured draft, not just raw code
- Flexible backend choice: Azure OpenAI, Ollama, or LM Studio
- Better first-pass thinking: useful when you want something closer to a draft delivery workflow than a single prompt completion
```powershell
Install-Module -Name PSAOAI
Install-Module -Name PSScriptAnalyzer
Install-Module -Name powerHTML
Install-Script AIPSTeam
```

For the smoothest first run, use the backend you already have working.
Supported providers:
PSAOAI module repo:
Important
You need Azure OpenAI, Ollama, or LM Studio to use this script.
This is the canonical first-run path for understanding what AIPSTeam does. It keeps the workflow non-interactive and disables RAG so you do not need search-provider setup for the first pass.
```powershell
$prompt = @"
Create a PowerShell tool that checks local administrator membership,
exports results to CSV, and generates basic documentation.
"@

$prompt | AIPSTeam.ps1 -LLMProvider "AzureOpenAI" -NOInteraction -NORAG -Stream $false
```

If you want the most completion-friendly path on heavier models, use the reduced workflow mode:

```powershell
$prompt | AIPSTeam.ps1 -LLMProvider "AzureOpenAI" -NOInteraction -NORAG -ReducedWorkflow -Stream $false
```

If you are using another backend, switch only the provider value:
```powershell
-LLMProvider "ollama"
-LLMProvider "LMStudio"
```
After a successful first run, you should expect output such as:
- a clarified project goal or requirements summary
- implementation ideas or proposed PowerShell structure
- generated PowerShell code or code fragments
- documentation draft content
- review-style or QA-style feedback
The exact result depends on the model quality, available context window, and the prompt you provide.
Important
The quality of the generated project depends significantly on the model used and the context window available. Better models and better context generally produce better drafts.
A realistic first run will not usually give you a polished production-ready module. What it should give you is a strong working draft, for example:
Requirements summary
- Check local administrator membership
- Export results to CSV
- Produce basic usage documentation
Proposed implementation
- Collect local group membership
- Normalize account names and types
- Export results with timestamp
- Add basic error handling and documentation
Draft deliverables
- PowerShell code skeleton
- suggested function layout
- documentation outline
- reviewer notes / next improvements
That is the value of AIPSTeam: a more structured project draft than a single one-shot reply.
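To make the "draft deliverables" stage concrete, the generated code skeleton for the quickstart prompt might look something like the sketch below. This is a hypothetical illustration of the kind of output to expect, not literal AIPSTeam output; the function name and folder layout are invented for the example.

```powershell
# Hypothetical sketch of a draft skeleton for the quickstart prompt.
# Get-LocalGroupMember requires the built-in LocalAccounts module on Windows.
function Get-LocalAdminReport {
    [CmdletBinding()]
    param (
        [string]$OutputFolder = (Join-Path $env:TEMP 'AdminReports')
    )

    if (-not (Test-Path $OutputFolder)) {
        New-Item -Path $OutputFolder -ItemType Directory | Out-Null
    }

    try {
        # Collect local Administrators group membership
        $members = Get-LocalGroupMember -Group 'Administrators' -ErrorAction Stop

        # Normalize account names and types before export
        $report = $members | Select-Object `
            @{ Name = 'Account'; Expression = { $_.Name } },
            @{ Name = 'Type';    Expression = { $_.ObjectClass } },
            @{ Name = 'Source';  Expression = { $_.PrincipalSource } }

        # Export results with a timestamped file name
        $stamp   = Get-Date -Format 'yyyyMMdd-HHmmss'
        $csvPath = Join-Path $OutputFolder "LocalAdmins-$stamp.csv"
        $report | Export-Csv -Path $csvPath -NoTypeInformation
        Write-Output "Report written to $csvPath"
    }
    catch {
        Write-Error "Failed to read local Administrators group: $_"
    }
}
```

Even a draft at this level still needs human review before production use, which is exactly what the reviewer-style feedback stage is for.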
For a fuller walkthrough, see:
That file shows a representative first-run scenario, the command used, the shape of the output, and what still needs human review.
These are the most useful parameters for a realistic first run:
- `userInput` — project outline as a string; can also be piped
- `-NOInteraction` — run without prompts or menus during the session
- `-ReducedWorkflow` — run a smaller Manager + Developer flow that skips the heavier later stages
- `-LLMProvider` — choose the backend: `AzureOpenAI`, `ollama`, or `LMStudio`
- `-NORAG` — disable retrieval for a simpler first run
- `-Stream $false` — disable streaming for a cleaner, easier-to-review result
- `-TheCodePath` — work directly on an existing PowerShell script
Best-supported path
- Windows PowerShell / PowerShell-focused workflow
- one configured LLM backend only for the first run
- non-interactive first pass with `-NOInteraction -NORAG -Stream $false`
Supported providers
- Azure OpenAI
- Ollama
- LM Studio
Practical guidance
- Start with the provider you already have working.
- Start without RAG for the first run.
- Treat Windows as the safest default path for now.
- If you are on WSL/Linux or a mixed environment, expect more setup friction and verify the PSAOAI/runtime path first.
- First-run output is a working draft, not a production-ready deliverable.
- Result quality depends heavily on the selected model and available context window.
- The smoothest path is to configure exactly one backend first and leave RAG off until the base flow works.
- PSAOAI/runtime environment issues can be the main source of friction, especially outside the most typical Windows-oriented setup.
- Advanced parameters are useful, but they are not the right starting point for understanding the tool.
Use these after the basic flow is already clear:
- `-Stream` — enable or disable live streaming (`$true` by default)
- `-NOPM` — disable Project Manager functions
- `-ReducedWorkflow` — keep only the initial Manager → Developer pass and skip the later multi-review stages
- `-NODocumentator` — disable Documentator functions
- `-NOLog` — disable logging
- `-LogFolder` — specify where logs should be stored
- `-DeploymentChat` — override the Azure OpenAI deployment setting
- `-MaxTokens` — control the length of generated responses
- `-NOTips` — disable tips
- `-VerbosePrompt` — show prompts
- `-LoadProjectStatus` — resume from a saved project state
- `-NOUserInputCheck` — disable the input check step
This PowerShell script simulates a team of AI agents working together on a PowerShell project. Each specialist has a role and contributes to the project in sequence. The script processes user input, performs different project tasks, and can generate outputs such as code, documentation, and analysis reports.
The main value of AIPSTeam is not just “generate some code,” but to move through a more structured multi-role flow that resembles requirements thinking, implementation, review, and documentation.
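Conceptually, the sequential multi-role flow can be sketched as follows. This is a deliberate simplification, not the script's actual implementation: `Invoke-LLM` is a hypothetical stand-in for the provider call, and the role list is condensed from the roles mentioned in this README.

```powershell
# Illustrative simplification of the multi-role flow.
# Invoke-LLM is a hypothetical stand-in for the real backend call.
$roles = @(
    @{ Name = 'Manager';      Task = 'Clarify the goal and list requirements.' },
    @{ Name = 'Developer';    Task = 'Draft the PowerShell implementation.' },
    @{ Name = 'Reviewer';     Task = 'Review the draft and flag issues.' },
    @{ Name = 'Documentator'; Task = 'Write usage documentation.' }
)

$context = "Create a PowerShell tool that checks local administrator membership."
foreach ($role in $roles) {
    # Each specialist sees the accumulated context and appends its contribution
    $response = Invoke-LLM -SystemPrompt "You are the project $($role.Name)." `
                           -UserPrompt "$($role.Task)`n`nProject so far:`n$context"
    $context += "`n`n## $($role.Name)`n$response"
}
```

The accumulated context is what lets later roles react to earlier output, which is why the result resembles a delivery workflow rather than a single completion.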
RAG combines retrieval and generation to produce more accurate and contextually relevant outputs.
How it works:
- Retrieval: the system fetches relevant external information.
- Generation: the LLM uses that information to produce better output.
By integrating these two phases, AIPSTeam can produce more informed responses than a prompt-only flow.
AIPSTeam uses web search providers as a source for retrieval. Supported services include:
- SerpApi — 100 free searches per month
- EXA — 1000 free searches per month
- Serper — 1000 free searches per month
- Bing Web Search API — retired (retirement notice)
Current behavior:
- the script tries SerpApi first
- if that fails, it tries EXA
- if that fails, it tries Serper
- if all fail, the script continues without successful retrieval
For a simpler first experience, use -NORAG and come back to RAG later.
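The fallback order above can be sketched as a simple provider chain. The helper names (`Invoke-SerpApiSearch` and so on) are illustrative, not the script's real internal functions:

```powershell
# Hypothetical sketch of the retrieval fallback chain described above.
# The Invoke-* helper names are illustrative stand-ins.
function Get-RagContext {
    param ([string]$Query)

    $providers = @(
        { param($q) Invoke-SerpApiSearch -Query $q },
        { param($q) Invoke-ExaSearch     -Query $q },
        { param($q) Invoke-SerperSearch  -Query $q }
    )

    foreach ($provider in $providers) {
        try {
            $result = & $provider $Query
            if ($result) { return $result }   # first successful provider wins
        }
        catch {
            Write-Verbose "Provider failed, trying the next one: $_"
        }
    }

    # All providers failed: continue without retrieval
    return $null
}
```

Returning `$null` rather than throwing matches the documented behavior: the script continues without retrieval instead of aborting the run.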
To configure external providers, set the needed environment variables before running the script.
- `PSAOAI_API_AZURE_OPENAI_KEY`
- `PSAOAI_API_AZURE_OPENAI_ENDPOINT`
- `PSAOAI_API_AZURE_OPENAI_APIVERSION`
- `PSAOAI_API_AZURE_OPENAI_CC_DEPLOYMENT`
- `PSAOAI_BANNER`
Example:
```powershell
[Environment]::SetEnvironmentVariable('PSAOAI_API_AZURE_OPENAI_ENDPOINT','https://<your-endpoint>.openai.azure.com','user')
[Environment]::SetEnvironmentVariable('PSAOAI_API_AZURE_OPENAI_APIVERSION','2024-05-01-preview','user')
[Environment]::SetEnvironmentVariable('PSAOAI_API_AZURE_OPENAI_CC_DEPLOYMENT','your-deployment-name','user')
```

Important
The PSAOAI_API_AZURE_OPENAI_KEY environment variable cannot be provided manually because the PSAOAI module encrypts it for security purposes. Ensure that the key is set and managed through PSAOAI's secure mechanisms.
- `OLLAMA_ENDPOINT`
- `OLLAMA_MODEL`
The script sets OLLAMA_ENDPOINT to http://localhost:11434/ by default.
Example:
```powershell
# Set OLLAMA_MODEL to the Ollama model you want to use, e.g. phi3:latest
[Environment]::SetEnvironmentVariable('OLLAMA_MODEL','phi3:latest','user')
```

Important
For the Ollama provider, you do not need to manually define the OLLAMA_MODEL environment variable before the first run. The script can check the status of Ollama and guide model selection interactively when needed.
- `OPENAI_API_KEY`
- `OPENAI_API_BASE`
Example:
```powershell
[Environment]::SetEnvironmentVariable('OPENAI_API_KEY','lm-studio','user')
[Environment]::SetEnvironmentVariable('OPENAI_API_BASE','http://localhost:1234/v1','user')
```

- `SERPAPI_API_KEY`
- `EXA_API_KEY`
- `SERPER_API_KEY`
- `AZURE_BING_API_KEY`
- `AZURE_BING_ENDPOINT`
Example:
```powershell
[Environment]::SetEnvironmentVariable('SERPAPI_API_KEY','your-serpapi-api-key','user')
```

If you are new to the project, treat the examples below as after-first-run patterns. The recommended demo above remains the canonical starting point.
- Basic usage

  ```powershell
  "Monitor RAM usage and show a single color block based on the load." | AIPSTeam.ps1
  ```

  or

  ```powershell
  AIPSTeam -userInput "Monitor RAM usage and show a single color block based on the load."
  ```

- Disable live streaming

  ```powershell
  "Monitor RAM usage" | AIPSTeam.ps1 -Stream $false
  ```

- Work on an existing script

  ```powershell
  AIPSTeam.ps1 -TheCodePath "C:\UserScripts\script.ps1"
  ```

  This mode is useful when you want the AI team to improve, debug, or extend an existing PowerShell script instead of starting from a plain-language description.

- Run without interaction

  ```powershell
  "Generate a daily system health report." | AIPSTeam.ps1 -NOInteraction
  ```

- Use Ollama

  ```powershell
  "Recent software activities on Windows 11." | AIPSTeam -LLMProvider "ollama" -Stream $false
  ```

- Load a saved project status

  ```powershell
  AIPSTeam.ps1 -LoadProjectStatus "path\to\your\Project.xml"
  ```
- Main Script: `AIPSTeam.ps1`
- Classes: `ProjectTeam`
- Functions: various utility functions for processing input, logging, and analysis
- ProjectTeam Class: represents a team member with specific expertise
  - Methods: `ProcessInput`, `Feedback`, `AddLogEntry`, `Notify`, `SummarizeMemory`
- Utility Functions: `SendFeedbackRequest`, `Invoke-CodeWithPSScriptAnalyzer`, `Export-AndWritePowerShellCodeBlocks`
- PowerShell Version: PowerShell 5.1 or later
- Modules: `PSAOAI`, `PSScriptAnalyzer`, `powerHTML`
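To verify the prerequisites before a first run, a quick check along these lines may help (a sketch; the module names come from the list above):

```powershell
# Check for the required modules and install any that are missing
# from the PowerShell Gallery.
$required = 'PSAOAI', 'PSScriptAnalyzer', 'powerHTML'

foreach ($name in $required) {
    if (Get-Module -ListAvailable -Name $name) {
        Write-Host "$name is already installed."
    }
    else {
        Write-Host "Installing $name..."
        Install-Module -Name $name -Scope CurrentUser
    }
}
```

Using `-Scope CurrentUser` avoids needing an elevated session for the install step itself.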
- Module not found: install the required modules with `Install-Module`
- Permission issues: run PowerShell as Administrator if your environment requires it
- Script errors: check the generated log files for details
- Provider setup confusion: start with one backend only and use `-NORAG` for the first pass
- Too much output noise: use `-NOInteraction -Stream $false` for a calmer first run
- WSL/Linux friction: if the runtime path behaves oddly outside the most typical Windows setup, verify PSAOAI and backend configuration before assuming the script logic is broken
- Install the required modules:

  ```powershell
  Install-Module -Name PSAOAI
  Install-Module -Name PSScriptAnalyzer
  Install-Module -Name powerHTML
  ```

- Disable live streaming:

  ```powershell
  AIPSTeam.ps1 -Stream $false
  ```

- Log files are stored in the specified log folder or in the default folder under MyDocuments.

- Load a saved project status:

  ```powershell
  AIPSTeam.ps1 -LoadProjectStatus "path\to\your\Project.xml"
  ```

- Disable the Project Manager role:

  ```powershell
  AIPSTeam.ps1 -NOPM
  ```

- Use a custom Ollama model:

  ```powershell
  $env:OLLAMA_MODEL = "your_custom_model"
  AIPSTeam.ps1 -LLMProvider "ollama"
  ```

For deeper setup and first-run guidance, prefer the sections above before treating this FAQ as the main entry point.
