Generate production-ready infrastructure from plain English descriptions
Text2IaC transforms natural language descriptions into production-ready infrastructure code: describe what you need in plain English, and the platform generates deployable infrastructure for you.
git clone https://github.com/devopsterminal/text2iac-platform.git
cd text2iac-platform
# Copy environment template
cp .env.example .env
# Start all services
make start
# Monitor Ollama logs
docker-compose logs -f ollama
# Should see: "Mistral 7B model loaded successfully"
# Send test email (if SMTP configured)
echo "Create a Node.js API with PostgreSQL database" | \
mail -s "[TEXT2IAC] Test API" infrastructure@localhost
# Open web interface
open http://localhost:8080
# Or manually navigate to http://localhost:8080
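On a first run, startup can take a while (the Mistral model has to be pulled), so a small wait loop is handy before opening the interface. A minimal POSIX-sh sketch; the endpoints are the ones used elsewhere in this README, and `wait_until` is a helper name introduced here, not part of the project:

```shell
#!/bin/sh
# wait_until: retry a command until it succeeds, up to <attempts> tries
# spaced <delay> seconds apart. Returns 0 on success, 1 on timeout.
wait_until() {
  attempts=$1
  delay=$2
  shift 2
  i=0
  while [ "$i" -lt "$attempts" ]; do
    if "$@" >/dev/null 2>&1; then
      return 0
    fi
    i=$((i + 1))
    sleep "$delay"
  done
  return 1
}

# Block until the API server and Ollama answer (endpoints from this README):
# wait_until 30 2 curl -sf http://localhost:3001/health
# wait_until 30 2 curl -sf http://localhost:11434/api/ps
```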
Subject: [TEXT2IAC] User Management API
Create a Node.js REST API with:
- User authentication (JWT)
- PostgreSQL database
- Redis caching
- Auto-scaling setup
- Basic monitoring
Expected traffic: 1000 requests/hour
Environment: Production
Generated Output:
Build an e-commerce platform with:
- Product catalog (Elasticsearch)
- Shopping cart (Redis)
- Payment processing (Stripe integration)
- Order management (PostgreSQL)
- Admin dashboard
- Real-time analytics
Generated Output:
┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
│ Email │────│ Text2IaC │────│ Generated │
│ Integration │ │ API Server │ │ Infrastructure │
└─────────────────┘ └─────────────────┘ └─────────────────┘
│ │ │
┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
│ Web Interface │────│ Mistral 7B │────│ Templates │
│ Dashboard │ │ (via Ollama) │ │ & Examples │
└─────────────────┘ └─────────────────┘ └─────────────────┘
Send infrastructure requests to infrastructure@yourcompany.com:
To: infrastructure@yourcompany.com
Subject: [TEXT2IAC] Project Name
Describe your infrastructure needs in plain English...
Access the web dashboard at http://localhost:8080.
Direct API access for programmatic use:
# Generate infrastructure
curl -X POST http://localhost:3001/api/generate \
-H "Content-Type: application/json" \
-d '{"description": "Create a blog with comments", "environment": "production"}'
# Check status
curl http://localhost:3001/api/status/{request_id}
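When scripting against the API, it helps to build the JSON body separately from the `curl` call. A sketch assuming the `POST /api/generate` payload shown above; `build_request` is a hypothetical helper introduced here:

```shell
#!/bin/sh
# build_request: emit the JSON body expected by POST /api/generate.
# NOTE: naive quoting - descriptions containing double quotes or backslashes
# would need real JSON escaping (e.g. jq -n --arg).
build_request() {
  description=$1
  environment=${2:-production}   # default matches the example above
  printf '{"description": "%s", "environment": "%s"}' \
    "$description" "$environment"
}

# Usage:
# build_request "Create a blog with comments" production | \
#   curl -X POST http://localhost:3001/api/generate \
#     -H "Content-Type: application/json" -d @-
```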
# Core Settings
OLLAMA_MODEL=mistral:7b
API_PORT=3001
WEB_PORT=8080
# Email Configuration (optional)
SMTP_HOST=mail.company.com
SMTP_USER=infrastructure@company.com
SMTP_PASS=your-password
IMAP_HOST=mail.company.com
# Database
DB_HOST=postgres
DB_NAME=text2iac
DB_USER=text2iac
DB_PASS=secure-password
# Security
JWT_SECRET=your-jwt-secret
API_KEY=your-api-key
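Before starting the stack, a quick check that required variables are actually set can save a debugging round-trip. A minimal sketch; `check_env` is a helper name introduced here, and the variable list mirrors the settings above:

```shell
#!/bin/sh
# check_env: print any unset (or empty) variables from the list to stderr
# and return non-zero if at least one is missing.
check_env() {
  missing=0
  for var in "$@"; do
    eval "val=\${$var:-}"
    if [ -z "$val" ]; then
      echo "missing: $var" >&2
      missing=1
    fi
  done
  return $missing
}

# Usage (core settings from this README; SMTP_* only if email is enabled):
# check_env OLLAMA_MODEL API_PORT WEB_PORT DB_HOST DB_PASS JWT_SECRET API_KEY
```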
If you want email integration, configure SMTP/IMAP:
# Gmail
SMTP_HOST=smtp.gmail.com
IMAP_HOST=imap.gmail.com
SMTP_USER=infrastructure@company.com
SMTP_PASS=app-specific-password
# Office 365
SMTP_HOST=smtp.office365.com
IMAP_HOST=outlook.office365.com
# Internal mail server
SMTP_HOST=mail.company.internal
IMAP_HOST=mail.company.internal
# Check all services
make health-check
# Individual service health
curl http://localhost:3001/health # API Server
curl http://localhost:8080/health # Web Interface
curl http://localhost:11434/api/ps # Ollama LLM
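The individual checks above can be scripted into a single pass/fail sweep. A sketch: `health_sweep` is a helper introduced here (not part of the Makefile) that reads "label command" lines from stdin:

```shell
#!/bin/sh
# health_sweep: read "label command..." lines from stdin, run each command,
# and print PASS/FAIL per label. Returns non-zero if any check failed.
health_sweep() {
  rc=0
  while read -r label cmd; do
    [ -z "$cmd" ] && continue
    if sh -c "$cmd" >/dev/null 2>&1; then
      printf 'PASS %s\n' "$label"
    else
      printf 'FAIL %s\n' "$label"
      rc=1
    fi
  done
  return $rc
}

# Usage with the endpoints above:
# health_sweep <<'EOF'
# api    curl -sf http://localhost:3001/health
# web    curl -sf http://localhost:8080/health
# ollama curl -sf http://localhost:11434/api/ps
# EOF
```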
# View all logs
docker-compose logs -f
# Specific service logs
docker-compose logs -f api
docker-compose logs -f email-bridge
docker-compose logs -f ollama
- API metrics: http://localhost:3001/metrics
- Prometheus: http://localhost:9090 (if Prometheus enabled)
- Grafana: http://localhost:3000 (if Grafana enabled)
# Start in development mode
make dev
# Run tests
make test
# Code formatting
make format
# Type checking
make lint
Add custom templates in the templates/ directory and register them in api/src/services/template.service.ts.
Modify LLM prompts in config/prompts/:
- system-prompt.txt - Base instructions for the LLM
- terraform-prompt.txt - Terraform-specific guidance
- kubernetes-prompt.txt - Kubernetes-specific guidance
# Production deployment
docker-compose -f docker-compose.yml -f docker-compose.prod.yml up -d
# With monitoring stack
docker-compose -f docker-compose.yml -f docker-compose.monitoring.yml up -d
# Apply manifests
kubectl apply -f k8s/
# Check status
kubectl get pods -n text2iac
1. Create your feature branch (git checkout -b feature/amazing-feature)
2. Commit your changes (git commit -m 'Add amazing feature')
3. Push to the branch (git push origin feature/amazing-feature)
See CONTRIBUTING.md for detailed guidelines.
LLM not responding:
# Check Ollama status
curl http://localhost:11434/api/ps
# Restart Ollama
docker-compose restart ollama
Email not working:
# Check email bridge logs
docker-compose logs email-bridge
# Test SMTP connection
telnet $SMTP_HOST 587
Web interface not loading:
# Check frontend logs
docker-compose logs frontend
# Verify API connection
curl http://localhost:3001/health
For better LLM performance:
For high request volume:
docker-compose up --scale api=3
This project is licensed under the Apache License - see the LICENSE file for details.
Made with ❤️ by the Platform Engineering Team