The Problem
You’ve just installed Claude Code, the GitHub Copilot extension, or another AI coding assistant. You run it on your corporate laptop and immediately get SSL certificate errors, connection timeouts, or cryptic authentication failures.
This isn’t a bug in the tool. It’s your corporate network actively tampering with HTTPS traffic.
How Corporate Networks Break LLM Tools
Modern enterprise networks use SSL/TLS inspection proxies to decrypt and re-encrypt HTTPS traffic (see Deep Dive into Deep Packet Inspection). This works transparently for web browsers because browsers trust the corporate CA certificate that gets distributed via Group Policy.
LLM CLI tools and API clients don’t use the system browser trust store by default. They use:
- Bundled CA certificates (Python’s certifi, Node.js’s built-in CA list)
- OpenSSL system store (which may or may not include corporate CAs)
- Their own certificate validation logic
When a corporate proxy intercepts the request to api.anthropic.com or api.openai.com, it presents a certificate signed by the corporate CA. The LLM tool doesn’t trust that CA, and rejects the connection.
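You can see which trust store a Python-based tool consults using the standard library's `ssl` module:

```python
import ssl

# Default CA locations for this Python/OpenSSL build. If the corporate CA
# is in none of them, connections through an inspecting proxy fail verification.
paths = ssl.get_default_verify_paths()
print("cafile: ", paths.cafile)               # e.g. a bundled cacert.pem, or None
print("capath: ", paths.capath)               # e.g. /etc/ssl/certs, or None
print("override:", paths.openssl_cafile_env)  # env var OpenSSL honors, usually SSL_CERT_FILE
```

Node.js is analogous: it ships its own compiled-in CA list and only consults additional CAs when `NODE_EXTRA_CA_CERTS` is set.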
Observable Symptoms
SSL Certificate Errors
$ claude
Error: unable to verify the first certificate
at TLSSocket.onConnectEnd (_tls_wrap.js:1495)
SSL_ERROR_RX_RECORD_TOO_LONG
certificate verify failed: unable to get local issuer certificate
# Python-based tools:
requests.exceptions.SSLError: HTTPSConnectionPool(host='api.anthropic.com', port=443):
Max retries exceeded... CERTIFICATE_VERIFY_FAILED
Silent Failures / Timeouts
Some proxies perform response tampering — injecting error pages, modifying JSON responses, or stripping headers. This can cause:
- JSON parse errors (proxy injects HTML error page)
- Authentication failures (proxy strips the Authorization header)
- Rate limit errors (proxy returns its own 429)
- Partial response streaming (proxy buffers or truncates SSE streams)
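When debugging these, a defensive parse step helps distinguish a genuine API error from a proxy-injected HTML block page. The helper below is illustrative, not part of any vendor SDK:

```python
import json

def parse_api_body(body: bytes):
    """Parse a JSON API response, flagging likely proxy tampering."""
    try:
        return json.loads(body)
    except json.JSONDecodeError:
        head = body.lstrip()[:64]
        if head.startswith((b"<!DOCTYPE", b"<html", b"<HTML")):
            # A JSON API never returns HTML; a proxy almost certainly injected this
            raise RuntimeError("Response is HTML, not JSON: proxy likely injected a block page")
        raise

print(parse_api_body(b'{"ok": true}'))  # {'ok': True}
```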
Streaming Response Issues
LLM APIs commonly use Server-Sent Events (SSE) for streaming. Proxies often:
- Buffer the entire response before forwarding (destroying the streaming UX)
- Strip Transfer-Encoding: chunked headers
- Inject content into the stream
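SSE events are plain text frames separated by a blank line, with payloads on data: lines. A minimal parser (a sketch, independent of any SDK) shows why a buffering proxy still yields valid events, only delivered all at once at the end instead of incrementally:

```python
def iter_sse_data(chunks):
    """Yield the data payload of each SSE event from an iterable of text chunks."""
    buf = ""
    for chunk in chunks:
        buf += chunk
        # Events are delimited by a blank line
        while "\n\n" in buf:
            raw_event, buf = buf.split("\n\n", 1)
            data = [line[len("data: "):] for line in raw_event.split("\n")
                    if line.startswith("data: ")]
            if data:
                yield "\n".join(data)

# Unbuffered: two chunks arrive as two incremental events.
# Buffered by a proxy: one big chunk at the end yields the same events, just late.
stream = ["data: hello\n\n", "data: world\n\n"]
print(list(iter_sse_data(stream)))  # ['hello', 'world']
```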
Diagnosing the Issue
Step 1: Check If You’re Behind a Proxy
# Check environment proxy settings
echo $HTTP_PROXY $HTTPS_PROXY $http_proxy $https_proxy
# Check system proxy (macOS)
networksetup -getwebproxy Wi-Fi
networksetup -getsecurewebproxy Wi-Fi
# Check system proxy (Windows PowerShell)
Get-ItemProperty -Path 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings'
Step 2: Inspect the Certificate Being Presented
# See what certificate the proxy presents for an LLM API endpoint
echo | openssl s_client -connect api.anthropic.com:443 2>/dev/null | openssl x509 -noout -issuer -subject -dates
# If you see your corporate org in the issuer field, DPI is active:
# issuer=O=Contoso Corp, CN=Contoso Secure Proxy CA
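To script this check across many endpoints, you can match the issuer line against your organisation's name. The helper below is hypothetical (Contoso stands in for your corporate CA name):

```python
def looks_like_dpi(issuer_line: str, corp_markers=("Contoso",)) -> bool:
    """Return True if an openssl -noout -issuer line names the corporate CA.

    corp_markers is whatever identifies your organisation's CA;
    adjust it to your company name.
    """
    return issuer_line.startswith("issuer=") and any(
        marker in issuer_line for marker in corp_markers
    )

print(looks_like_dpi("issuer=O=Contoso Corp, CN=Contoso Secure Proxy CA"))  # True
print(looks_like_dpi("issuer=C=US, O=Let's Encrypt, CN=R3"))                # False
```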
Step 3: Test With Certificate Verification Disabled (Diagnostic Only)
# NEVER use this in production — for diagnosis only
curl -k https://api.anthropic.com/v1/messages \
-H "x-api-key: $ANTHROPIC_API_KEY" \
-H "anthropic-version: 2023-06-01" \
-H "content-type: application/json" \
-d '{"model":"claude-3-haiku-20240307","max_tokens":10,"messages":[{"role":"user","content":"ping"}]}'
If this works with -k but fails without it, you have a certificate trust problem.
Solutions
Solution 1: Export and Trust the Corporate CA
The proper fix is to add the corporate CA to your tool’s trust store.
Export the corporate CA:
# From a browser (Chrome): Settings > Security > Manage Certificates > Export
# Via OpenSSL from a known HTTPS site
echo | openssl s_client -connect internal.example.com:443 -showcerts 2>/dev/null | \
  awk '/BEGIN CERTIFICATE/,/END CERTIFICATE/' > /tmp/corporate-chain.pem
# Caution: piping straight into a single openssl x509 keeps only the FIRST (leaf)
# certificate. The corporate CA is normally the LAST certificate in the chain;
# extract it and save it as corporate-ca.pem.
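Since -showcerts prints the full chain and the corporate root CA is typically the last certificate in it, a short script can split the chain (a sketch; verify the subject before trusting anything):

```python
import re

def certs_in_chain(pem_text: str) -> list[str]:
    """Return every PEM certificate block in an openssl -showcerts dump."""
    return re.findall(
        r"-----BEGIN CERTIFICATE-----.*?-----END CERTIFICATE-----",
        pem_text,
        re.DOTALL,
    )

# Example (hypothetical path):
# chain = certs_in_chain(open("/tmp/chain.pem").read())
# corporate_ca = chain[-1]   # usually the root; inspect its subject first
```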
Configure tools to use it:
# Python / pip
export REQUESTS_CA_BUNDLE=/path/to/corporate-ca.pem
export SSL_CERT_FILE=/path/to/corporate-ca.pem
# Node.js
export NODE_EXTRA_CA_CERTS=/path/to/corporate-ca.pem
# curl
export CURL_CA_BUNDLE=/path/to/corporate-ca.pem
# Git
git config --global http.sslCAInfo /path/to/corporate-ca.pem
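Under the hood, Python HTTP stacks build an ssl.SSLContext from these variables. A sketch of what honoring the bundle amounts to (real libraries have more elaborate resolution rules):

```python
import os
import ssl

def make_tls_context() -> ssl.SSLContext:
    """Build a verifying TLS context, preferring a corporate CA bundle if set."""
    ca_bundle = os.environ.get("REQUESTS_CA_BUNDLE") or os.environ.get("SSL_CERT_FILE")
    # cafile=None falls back to the platform's default trust store
    return ssl.create_default_context(cafile=ca_bundle)

ctx = make_tls_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # verification stays enabled
```

The point of this approach is that certificate verification remains on; you extend the trust store rather than disable it.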
Add to system trust store (Linux):
# Ubuntu/Debian
sudo cp corporate-ca.pem /usr/local/share/ca-certificates/corporate-ca.crt
sudo update-ca-certificates
# RHEL/CentOS/Fedora
sudo cp corporate-ca.pem /etc/pki/ca-trust/source/anchors/
sudo update-ca-trust
Solution 2: Configure Proxy Settings Explicitly
Many tools respect standard proxy environment variables:
export HTTP_PROXY=http://proxy.internal.example.com:8080
export HTTPS_PROXY=http://proxy.internal.example.com:8080
export NO_PROXY=localhost,127.0.0.1,.internal.example.com
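Python's standard library reads exactly these variables, so you can confirm what a Python-based tool will see:

```python
import os
import urllib.request

# Normally set in your shell profile; set here only to demonstrate.
os.environ["HTTPS_PROXY"] = "http://proxy.internal.example.com:8080"
os.environ["NO_PROXY"] = "localhost,127.0.0.1,.internal.example.com"

print(urllib.request.getproxies()["https"])   # the proxy URL above

# NO_PROXY suffix matching: internal hosts skip the proxy
print(bool(urllib.request.proxy_bypass_environment("db.internal.example.com")))
```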
Solution 3: Use a Non-Inspected Path
Some environments offer exceptions for specific destinations. Work with your security team to add LLM API endpoints to the SSL inspection bypass list.
Response Tampering Beyond Certificate Substitution
SSL inspection isn’t the only form of tampering. Proxies may also:
Header Injection/Removal
# Proxy may remove:
Authorization: Bearer sk-...
# Proxy may inject:
X-Forwarded-For: 10.0.0.1
Via: 1.1 corporate-proxy
# Some proxies inject authentication tokens they control,
# bypassing your intended authentication
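If you can reach an echo endpoint you control outside the proxy (any server that returns the request headers it received), diffing sent vs. received headers makes tampering visible. An illustrative helper:

```python
def header_diff(sent: dict, received: dict):
    """Compare the headers you sent with what the origin actually received."""
    sent_keys = {k.lower() for k in sent}
    recv_keys = {k.lower() for k in received}
    stripped = sorted(sent_keys - recv_keys)   # removed by the proxy
    injected = sorted(recv_keys - sent_keys)   # added by the proxy
    return stripped, injected

sent = {"Authorization": "Bearer sk-...", "Content-Type": "application/json"}
received = {"Content-Type": "application/json",
            "Via": "1.1 corporate-proxy",
            "X-Forwarded-For": "10.0.0.1"}
print(header_diff(sent, received))
# (['authorization'], ['via', 'x-forwarded-for'])
```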
Content Modification
Some DLP (Data Loss Prevention) solutions actively modify response content or block certain patterns in requests (e.g., API keys, personal data). This can break:
- Base64-encoded content in JSON payloads
- Code containing patterns that match DLP rules
- Large payloads that trigger size limits
Connection Hijacking
In extreme cases, proxies may:
- Return cached responses instead of live API responses
- Redirect to error pages that look like API responses
- Modify JSON to strip or alter content
Implications for AI Security
From a security research perspective, corporate SSL inspection creates interesting attack surfaces:
- The proxy as a credential store — SSL inspection proxies often log decrypted traffic, including API keys and tokens
- Proxy as a MITM target — compromise of the inspection device yields decryption capability for all inspected traffic
- Trust chain pollution — adding corporate CAs to developer laptops can enable future interception beyond the intended scope
For organisations deploying LLM tools to corporate users: document the SSL inspection bypass configuration and provide it alongside the tool deployment. This reduces the likelihood of developers disabling certificate verification enterprise-wide as a workaround.
Quick Reference
| Problem | Diagnostic | Fix |
|---|---|---|
| SSL cert error | openssl s_client check issuer | Export corporate CA, set REQUESTS_CA_BUNDLE |
| Streaming broken | Check proxy buffering | Request SSE bypass from security team |
| Auth failures | Use curl -v to confirm whether the Authorization header reaches the destination | Request a header-rewrite exception from the security team |
| JSON parse errors | Check raw response | Look for proxy HTML injection |
| Timeout | Check HTTPS_PROXY env var | Set explicit proxy or request bypass |