Add Live Server Integration Testing Beyond Existing Integration Tests
Problem
The OGC-Client-CSAPI library currently has limited live server integration testing despite having excellent unit test coverage (832+ tests, 94% coverage - Issue #19) and comprehensive endpoint integration tests (10 tests, 100% coverage - Issue #18). The existing integration tests validate internal library behavior (URL building, parsing, validation) but do NOT test actual interoperability with real OGC API - Connected Systems servers.
Current Testing Landscape:
What IS Tested ✅:
Unit tests: 832+ tests verify individual components in isolation
"Test Against Reference Implementation - Test against OGC reference implementation (if available), Test against OpenSensorHub, Document interoperability issues, Verify standards compliance in practice"
"The implementation is production-ready and exceeds the documented compliance claims."
However, this verdict is based on code review only, not live server testing. The library achieves 98% theoretical compliance but has zero empirical validation against real OGC CSAPI servers.
Context
This issue was identified during the comprehensive validation conducted January 27-28, 2026.
What the library's conformance check verifies: the server declares CSAPI conformance classes
What it DOESN'T verify:
Does server actually implement declared conformance classes?
Are server responses valid per the spec?
Does client correctly parse server responses?
Can client complete end-to-end workflows?
Need: Live testing to verify conformance claims match reality
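The gap between declared and actual conformance can be made explicit in code. The sketch below separates the two: a pure helper inspects what a `/conformance` response *claims*, while live tests would still have to probe the endpoints to see whether the claim holds. The `inspectConformance` helper and the `CSAPI_PART1` substring are illustrative assumptions, not part of the library's API.

```typescript
// Sketch: distinguish *declared* conformance from anything actually verified.
// The substring below is an assumption about CSAPI Part 1 class URIs.
const CSAPI_PART1 = 'ogcapi-connected-systems-1';

interface ConformanceReport {
  declaresPart1: boolean;
  declaredClasses: string[];
}

// Pure helper: inspect the body of a /conformance response.
function inspectConformance(conformsTo: string[]): ConformanceReport {
  return {
    declaresPart1: conformsTo.some(c => c.includes(CSAPI_PART1)),
    declaredClasses: conformsTo,
  };
}

// Declared conformance is only a claim; live testing must still exercise
// the resources (e.g. GET /systems) to confirm the server implements it.
const report = inspectConformance([
  'http://www.opengis.net/spec/ogcapi-connected-systems-1/1.0/conf/core',
]);
console.log(report.declaresPart1); // → true
```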
Real-World Use Cases Requiring Live Testing
1. IoT Dashboard Scenario:
```ts
// User wants to display all temperature sensors on a map
const navigator = new TypedCSAPINavigator('https://opensensorhub.example.com/csapi');

// Does this work with real OpenSensorHub?
const systems = await navigator.getSystems({
  observedProperty: 'http://www.opengis.net/def/property/OGC/0/Temperature',
  bbox: [-122.5, 37.7, -122.3, 37.9],
});

// Can we parse real GeoJSON responses?
systems.data.forEach(system => {
  displayMarker(system.geometry.coordinates, system.properties.name);
});
```
Unknowns without live testing:
Does OpenSensorHub support observedProperty filtering?
Is the bbox parameter interpreted correctly?
Are coordinate systems handled properly (WGS84)?
Can we parse OpenSensorHub's GeoJSON format?
2. Data Ingestion Scenario:
```ts
// User wants to fetch latest observations for analysis
const datastream = await navigator.getDatastream('temp-sensor-1');
const observations = await navigator.getDatastreamObservations('temp-sensor-1', {
  phenomenonTime: 'latest',
  limit: 1000,
});

// Process observations
const values = observations.data.map(obs => obs.properties.result.value);
```
Unknowns without live testing:
Does server support phenomenonTime: 'latest' special value?
Is pagination enforced (what if server ignores limit)?
Can we parse real observation result formats (SWE Common)?
Are timestamps in correct ISO 8601 format?
3. Command & Control Scenario:
```ts
// User wants to issue a command to a sensor
const command = await navigator.issueCommand('sensor-ctrl-1', {
  type: 'Feature',
  geometry: null,
  properties: {
    commandType: 'SET_SAMPLING_RATE',
    parameters: { rate: 10 },
  },
});

// Check command status
const status = await navigator.getCommandStatus(command.id);
```
Unknowns without live testing:
Does server accept command format?
Is command execution tracked correctly?
Can we parse command result responses?
Does authentication work for control operations?
Performance and Network Considerations
Real-world network conditions not tested:
Latency: How does library perform with 200ms+ server latency?
Timeouts: Does library handle slow servers gracefully?
Large responses: Can library parse 100MB responses (100k observations)?
Pagination: Does library follow next links correctly?
Rate limiting: Does library respect Retry-After headers?
Connection failures: Does library retry on network errors?
CORS: Does library work from browser with CORS restrictions?
Current assumptions (not verified):
Assumes fast, reliable network connections
Assumes servers respond within default timeout
Assumes all data fits in memory
Assumes no rate limiting
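None of these assumptions is currently enforced client-side. As one example, the timeout assumption could be guarded with an abortable fetch wrapper; `fetchWithTimeout` below is a hypothetical helper (not part of the library), written generically over a fetch-like function so it can be exercised without a live server.

```typescript
// Hypothetical client-side timeout guard; not part of the library's API.
type FetchLike = (url: string, init?: RequestInit) => Promise<Response>;

async function fetchWithTimeout(
  fetchFn: FetchLike,
  url: string,
  timeoutMs: number,
  init: RequestInit = {}
): Promise<Response> {
  const controller = new AbortController();
  // Abort the request if the server takes longer than timeoutMs.
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    return await fetchFn(url, { ...init, signal: controller.signal });
  } finally {
    clearTimeout(timer);
  }
}

// Demo with a stub "server" that never responds: the call aborts after 50 ms
// instead of hanging forever (which is what the current assumptions imply).
const neverResponds: FetchLike = (_url, init) =>
  new Promise<Response>((_resolve, reject) => {
    init?.signal?.addEventListener('abort', () => reject(new Error('timed out')));
  });

let timedOut = false;
try {
  await fetchWithTimeout(neverResponds, 'https://slow.example.com/csapi', 50);
} catch {
  timedOut = true;
}
console.log(timedOut); // → true
```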
Authentication and Authorization
Supported authentication patterns (theoretical):
From Issue #20 validation, the library accepts arbitrary headers:
Risk: Authentication may not work with real servers despite header support
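Since only arbitrary-header support was validated, live tests would need to exercise each auth pattern explicitly. The sketch below shows the header shapes involved; `buildAuthHeaders` is a hypothetical helper and the token values are placeholders, not verified library API.

```typescript
// Sketch of the auth header shapes a live test would exercise.
// buildAuthHeaders is hypothetical; only arbitrary-header support is validated today.
type AuthConfig =
  | { type: 'none' }
  | { type: 'bearer'; token: string }
  | { type: 'apikey'; key: string }
  | { type: 'basic'; user: string; pass: string };

function buildAuthHeaders(auth: AuthConfig): Record<string, string> {
  switch (auth.type) {
    case 'bearer':
      return { Authorization: `Bearer ${auth.token}` };
    case 'apikey':
      return { 'X-API-Key': auth.key };
    case 'basic':
      // btoa is available in browsers and in Node.js 16+
      return { Authorization: `Basic ${btoa(`${auth.user}:${auth.pass}`)}` };
    case 'none':
      return {};
  }
}

console.log(buildAuthHeaders({ type: 'bearer', token: 'abc123' }));
```

Whether a real server accepts these headers (token refresh, key rotation, CORS preflight on `Authorization`) is exactly what only live testing can answer.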
Proposed Solution
Implement a comprehensive live server integration testing suite that validates interoperability with real OGC API - Connected Systems servers, covering authentication, CRUD operations, error handling, and performance.
2. Server Discovery and Conformance Testing
Test that the library correctly discovers server capabilities:
```ts
// tests/live-integration/discovery.spec.ts
describe('Live Server Discovery', () => {
  Object.entries(TEST_SERVERS).forEach(([serverName, config]) => {
    describe(`${serverName} server`, () => {
      let endpoint: OgcApiEndpoint;

      beforeAll(async () => {
        endpoint = await OgcApiEndpoint.fromUrl(config.baseUrl);
      });

      test('loads landing page', async () => {
        const landingPage = await endpoint.getLandingPage();
        expect(landingPage).toHaveProperty('title');
        expect(landingPage).toHaveProperty('links');
        expect(landingPage.links).toBeInstanceOf(Array);
        console.log(`${serverName} title:`, landingPage.title);
      });

      test('loads conformance classes', async () => {
        const conformance = await endpoint.getConformance();
        expect(conformance).toHaveProperty('conformsTo');
        expect(conformance.conformsTo).toBeInstanceOf(Array);

        // Check for CSAPI core conformance
        const hasPart1 = conformance.conformsTo.some(
          c => c.includes('ogcapi-connected-systems-1') || c.includes('ogcapi-cs-part1')
        );
        expect(hasPart1).toBe(true);
        console.log(`${serverName} conformance classes:`, conformance.conformsTo);
      });

      test('detects CSAPI support', async () => {
        const hasCSAPI = await endpoint.hasConnectedSystems;
        expect(hasCSAPI).toBe(true);
      });

      test('loads CSAPI collections', async () => {
        const collections = await endpoint.getCollections();
        expect(collections).toBeInstanceOf(Array);
        expect(collections.length).toBeGreaterThan(0);

        // Should have at least 'systems' collection
        const systemsCollection = collections.find(
          c => c.id === 'systems' || c.id.includes('systems')
        );
        expect(systemsCollection).toBeDefined();
        console.log(`${serverName} collections:`, collections.map(c => c.id));
      });

      test('detects available resource types', () => {
        const availableResources = endpoint.csapi.availableResources;
        expect(availableResources).toContain('systems');
        if (config.supports.includes('part2')) {
          expect(availableResources).toContain('datastreams');
        }
        console.log(`${serverName} resources:`, Array.from(availableResources));
      });
    });
  });
});
```
3. CRUD Operations Testing
Test complete create-read-update-delete workflows:
```ts
// tests/live-integration/crud.spec.ts
describe('Live Server CRUD Operations', () => {
  Object.entries(TEST_SERVERS).forEach(([serverName, config]) => {
    describe(`${serverName} server`, () => {
      let navigator: TypedCSAPINavigator;
      let createdSystemId: string;

      beforeAll(async () => {
        const endpoint = await OgcApiEndpoint.fromUrl(config.baseUrl);
        navigator = endpoint.csapi.typed();

        // Setup authentication
        if (config.auth.type === 'bearer') {
          navigator.setAuthHeaders({ Authorization: `Bearer ${config.auth.token}` });
        } else if (config.auth.type === 'apikey') {
          navigator.setAuthHeaders({ 'X-API-Key': config.auth.key });
        }
      });

      test('CREATE: Post new system', async () => {
        const newSystem = {
          type: 'Feature',
          geometry: { type: 'Point', coordinates: [-122.419, 37.775] },
          properties: {
            name: `Test System ${Date.now()}`,
            description: 'Created by live integration test',
            systemKind: 'sensor',
            validTime: ['2026-01-28T00:00:00Z', '..'],
          },
        };

        const createUrl = navigator.navigator.createSystemUrl();
        const response = await navigator.fetch(createUrl, {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify(newSystem),
        });

        expect(response.status).toBe(201); // Created
        expect(response.headers.get('Location')).toBeTruthy();

        const created = await response.json();
        expect(created).toHaveProperty('id');
        createdSystemId = created.id;
        console.log(`${serverName} created system:`, createdSystemId);
      });

      test('READ: Get created system', async () => {
        const result = await navigator.getSystem(createdSystemId);
        expect(result.data).toBeDefined();
        expect(result.data.id).toBe(createdSystemId);
        expect(result.data.properties.name).toContain('Test System');
        console.log(`${serverName} read system:`, result.data.properties.name);
      });

      test('UPDATE: Patch system properties', async () => {
        const updateUrl = navigator.navigator.patchSystemUrl(createdSystemId);
        const patch = {
          properties: { description: 'Updated by live integration test' },
        };

        const response = await navigator.fetch(updateUrl, {
          method: 'PATCH',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify(patch),
        });
        expect(response.status).toBe(200);

        // Verify update
        const updated = await navigator.getSystem(createdSystemId);
        expect(updated.data.properties.description).toBe('Updated by live integration test');
      });

      test('DELETE: Remove created system', async () => {
        const deleteUrl = navigator.navigator.deleteSystemUrl(createdSystemId);
        const response = await navigator.fetch(deleteUrl, { method: 'DELETE' });
        expect(response.status).toBe(204); // No Content

        // Verify deletion
        await expect(navigator.getSystem(createdSystemId)).rejects.toThrow();
      });
    });
  });
});
```
4. Query Parameter Testing
Test all query parameters with real servers:
```ts
// tests/live-integration/queries.spec.ts
describe('Live Server Query Parameters', () => {
  Object.entries(TEST_SERVERS).forEach(([serverName, config]) => {
    describe(`${serverName} server`, () => {
      let navigator: TypedCSAPINavigator;

      beforeAll(async () => {
        const endpoint = await OgcApiEndpoint.fromUrl(config.baseUrl);
        navigator = endpoint.csapi.typed();
      });

      test('Pagination: limit parameter', async () => {
        const result = await navigator.getSystems({ limit: 5 });
        expect(result.data.length).toBeLessThanOrEqual(5);
        console.log(`${serverName} returned ${result.data.length} systems (limit: 5)`);
      });

      test('Spatial: bbox parameter', async () => {
        const result = await navigator.getSystems({
          bbox: [-123, 37, -121, 39], // San Francisco Bay Area
        });
        expect(result.data).toBeInstanceOf(Array);

        // Verify all results within bbox
        result.data.forEach(system => {
          if (system.geometry && system.geometry.type === 'Point') {
            const [lon, lat] = system.geometry.coordinates;
            expect(lon).toBeGreaterThanOrEqual(-123);
            expect(lon).toBeLessThanOrEqual(-121);
            expect(lat).toBeGreaterThanOrEqual(37);
            expect(lat).toBeLessThanOrEqual(39);
          }
        });
      });

      test('Temporal: datetime parameter', async () => {
        const result = await navigator.getSystems({
          datetime: { start: '2025-01-01T00:00:00Z', end: '2026-01-01T00:00:00Z' },
        });
        expect(result.data).toBeInstanceOf(Array);
        console.log(`${serverName} returned ${result.data.length} systems for 2025`);
      });

      test('Full-text search: q parameter', async () => {
        const result = await navigator.getSystems({ q: 'temperature' });
        expect(result.data).toBeInstanceOf(Array);

        // Verify results contain search term
        result.data.forEach(system => {
          const text = JSON.stringify(system).toLowerCase();
          expect(text).toMatch(/temperature|temp/);
        });
      });

      test('Property filtering: observedProperty parameter', async () => {
        const result = await navigator.getSystems({
          observedProperty: 'http://www.opengis.net/def/property/OGC/0/Temperature',
        });
        expect(result.data).toBeInstanceOf(Array);
        console.log(`${serverName} returned ${result.data.length} temperature systems`);
      });

      test('Hierarchical: parent + recursive parameters', async () => {
        // First, get a system with subsystems
        const allSystems = await navigator.getSystems({ limit: 100 });
        const parentSystem = allSystems.data.find(
          s => s.properties.subsystems && s.properties.subsystems.length > 0
        );

        if (parentSystem) {
          const result = await navigator.getSystems({
            parent: parentSystem.id,
            recursive: true,
          });
          expect(result.data).toBeInstanceOf(Array);
          expect(result.data.length).toBeGreaterThan(0);
        }
      });

      test('Property path: select parameter', async () => {
        const result = await navigator.getSystems({
          limit: 5,
          select: 'id,properties.name,geometry',
        });
        expect(result.data).toBeInstanceOf(Array);

        // Verify only requested properties returned (server-dependent)
        result.data.forEach(system => {
          expect(system).toHaveProperty('id');
          expect(system).toHaveProperty('properties');
          expect(system.properties).toHaveProperty('name');
        });
      });
    });
  });
});
```
5. Format Negotiation Testing
Test content type negotiation with Accept headers:
```ts
// tests/live-integration/formats.spec.ts
describe('Live Server Format Negotiation', () => {
  Object.entries(TEST_SERVERS).forEach(([serverName, config]) => {
    describe(`${serverName} server`, () => {
      let navigator: TypedCSAPINavigator;

      beforeAll(async () => {
        const endpoint = await OgcApiEndpoint.fromUrl(config.baseUrl);
        navigator = endpoint.csapi.typed();
      });

      test('GeoJSON format (application/geo+json)', async () => {
        const url = navigator.navigator.getSystemsUrl({ limit: 1 });
        const response = await fetch(url, {
          headers: { Accept: 'application/geo+json' },
        });
        expect(response.headers.get('Content-Type')).toMatch(/geo\+json/);

        const data = await response.json();
        expect(data.type).toBe('FeatureCollection');
      });

      test('SensorML format (application/sml+json)', async () => {
        // Get a system ID first
        const systems = await navigator.getSystems({ limit: 1 });
        if (systems.data.length === 0) return;
        const systemId = systems.data[0].id;

        const url = navigator.navigator.getSystemUrl(systemId, 'application/sml+json');
        const response = await fetch(url, {
          headers: { Accept: 'application/sml+json' },
        });

        // Server may not support SensorML format
        if (response.ok) {
          expect(response.headers.get('Content-Type')).toMatch(/sml\+json/);
          const data = await response.json();
          expect(data).toHaveProperty('type'); // SensorML type
        }
      });

      test('Plain JSON format (application/json)', async () => {
        const url = navigator.navigator.getSystemsUrl({ limit: 1 });
        const response = await fetch(url, {
          headers: { Accept: 'application/json' },
        });
        expect(response.headers.get('Content-Type')).toMatch(/json/);
        expect(response.ok).toBe(true);
      });
    });
  });
});
```
6. Error Handling Testing
Test error responses and edge cases:
```ts
// tests/live-integration/errors.spec.ts
describe('Live Server Error Handling', () => {
  Object.entries(TEST_SERVERS).forEach(([serverName, config]) => {
    describe(`${serverName} server`, () => {
      let navigator: TypedCSAPINavigator;

      beforeAll(async () => {
        const endpoint = await OgcApiEndpoint.fromUrl(config.baseUrl);
        navigator = endpoint.csapi.typed();
      });

      test('404 Not Found: Invalid system ID', async () => {
        await expect(
          navigator.getSystem('nonexistent-system-id-12345')
        ).rejects.toThrow(/404|not found/i);
      });

      test('400 Bad Request: Invalid bbox', async () => {
        await expect(
          navigator.getSystems({
            bbox: [180, 90, -180, -90], // Invalid (min > max)
          })
        ).rejects.toThrow(/400|bad request|invalid/i);
      });

      test('401 Unauthorized: Missing authentication', async () => {
        // Create new navigator without auth
        const endpoint = await OgcApiEndpoint.fromUrl(config.baseUrl);
        const unauthNav = endpoint.csapi.typed();

        // Try to create resource (should fail if auth required)
        const createUrl = unauthNav.navigator.createSystemUrl();
        const response = await fetch(createUrl, {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify({ type: 'Feature', properties: {} }),
        });

        if (config.auth.type !== 'none') {
          expect([401, 403]).toContain(response.status);
        }
      });

      test('500 Internal Server Error: Handling', async () => {
        // This depends on server behavior, just verify graceful handling
        try {
          await navigator.getSystems({ limit: 1 });
        } catch (error) {
          expect(error).toBeInstanceOf(Error);
          expect(error.message).toBeTruthy();
        }
      });
    });
  });
});
```
7. Performance and Load Testing
Test library performance with real server latency:
```ts
// tests/live-integration/performance.spec.ts
describe('Live Server Performance', () => {
  Object.entries(TEST_SERVERS).forEach(([serverName, config]) => {
    describe(`${serverName} server`, () => {
      let navigator: TypedCSAPINavigator;

      beforeAll(async () => {
        const endpoint = await OgcApiEndpoint.fromUrl(config.baseUrl);
        navigator = endpoint.csapi.typed();
      });

      test('Response time: Small query (<100 items)', async () => {
        const startTime = performance.now();
        await navigator.getSystems({ limit: 10 });
        const endTime = performance.now();

        const duration = endTime - startTime;
        console.log(`${serverName} small query: ${duration.toFixed(2)}ms`);

        // Should complete in reasonable time (accounting for network latency)
        expect(duration).toBeLessThan(5000); // <5 seconds
      });

      test('Response time: Large query (1000+ items)', async () => {
        const startTime = performance.now();
        const result = await navigator.getSystems({ limit: 1000 });
        const endTime = performance.now();

        const duration = endTime - startTime;
        console.log(`${serverName} large query: ${duration.toFixed(2)}ms (${result.data.length} items)`);

        // Should complete in reasonable time
        expect(duration).toBeLessThan(30000); // <30 seconds
      });

      test('Pagination: Following next links', async () => {
        let totalItems = 0;
        let nextUrl = navigator.navigator.getSystemsUrl({ limit: 10 });
        let pages = 0;

        while (nextUrl && pages < 10) { // Limit to 10 pages for test
          const response = await fetch(nextUrl);
          const data = await response.json();
          totalItems += data.features.length;
          pages++;

          // Find next link
          const nextLink = data.links?.find((l: any) => l.rel === 'next');
          nextUrl = nextLink?.href || null;
        }

        console.log(`${serverName} pagination: ${totalItems} items across ${pages} pages`);
        expect(totalItems).toBeGreaterThan(0);
      });

      test('Concurrent requests: 10 simultaneous', async () => {
        const startTime = performance.now();
        const promises = Array.from({ length: 10 }, (_, i) =>
          navigator.getSystems({ limit: 10 })
        );
        const results = await Promise.all(promises);
        const endTime = performance.now();

        const duration = endTime - startTime;
        console.log(`${serverName} concurrent: ${duration.toFixed(2)}ms for 10 requests`);
        expect(results.length).toBe(10);
        expect(duration).toBeLessThan(10000); // <10 seconds
      });
    });
  });
});
```
8. Part 2 (Advanced) Features Testing
Test datastreams, observations, commands:
```ts
// tests/live-integration/part2.spec.ts
describe('Live Server Part 2 Features', () => {
  Object.entries(TEST_SERVERS).forEach(([serverName, config]) => {
    if (!config.supports.includes('part2')) {
      return; // Skip if server doesn't support Part 2
    }

    describe(`${serverName} server`, () => {
      let navigator: TypedCSAPINavigator;

      beforeAll(async () => {
        const endpoint = await OgcApiEndpoint.fromUrl(config.baseUrl);
        navigator = endpoint.csapi.typed();
      });

      test('Get datastreams', async () => {
        const result = await navigator.getDatastreams({ limit: 10 });
        expect(result.data).toBeInstanceOf(Array);
        console.log(`${serverName} has ${result.data.length} datastreams`);
      });

      test('Get observations for datastream', async () => {
        const datastreams = await navigator.getDatastreams({ limit: 1 });
        if (datastreams.data.length === 0) return;
        const datastreamId = datastreams.data[0].id;

        const observations = await navigator.getDatastreamObservations(datastreamId, {
          limit: 10,
        });
        expect(observations.data).toBeInstanceOf(Array);

        // Verify observation structure
        if (observations.data.length > 0) {
          const obs = observations.data[0];
          expect(obs).toHaveProperty('properties');
          expect(obs.properties).toHaveProperty('phenomenonTime');
          expect(obs.properties).toHaveProperty('result');
        }
      });

      test('Get latest observation (phenomenonTime: latest)', async () => {
        const datastreams = await navigator.getDatastreams({ limit: 1 });
        if (datastreams.data.length === 0) return;
        const datastreamId = datastreams.data[0].id;

        const observations = await navigator.getDatastreamObservations(datastreamId, {
          phenomenonTime: 'latest',
        });
        expect(observations.data).toBeInstanceOf(Array);
        if (observations.data.length > 0) {
          console.log(`${serverName} latest observation:`, observations.data[0].properties.phenomenonTime);
        }
      });

      test('Get control streams', async () => {
        const result = await navigator.getControlStreams({ limit: 10 });
        expect(result.data).toBeInstanceOf(Array);
        console.log(`${serverName} has ${result.data.length} control streams`);
      });

      test('Get system events', async () => {
        const result = await navigator.getSystemEvents({ limit: 10 });
        expect(result.data).toBeInstanceOf(Array);
        console.log(`${serverName} has ${result.data.length} system events`);
      });
    });
  });
});
```
9. CI/CD Integration
Add live testing to GitHub Actions:
```yaml
# .github/workflows/live-integration.yml
name: Live Integration Tests

on:
  schedule:
    - cron: '0 6 * * *' # Daily at 6 AM UTC
  workflow_dispatch: # Manual trigger

jobs:
  live-tests:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        server: [opensensorhub, istsos, reference]
    steps:
      - uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '18'

      - name: Install dependencies
        run: npm ci

      - name: Start test servers (if local)
        if: matrix.server == 'opensensorhub'
        run: docker-compose -f docker-compose.test.yml up -d

      - name: Wait for servers
        if: matrix.server == 'opensensorhub'
        run: |
          timeout 60 bash -c 'until curl -f http://localhost:8181/csapi; do sleep 1; done'

      - name: Run live integration tests
        env:
          TEST_SERVER: ${{ matrix.server }}
          OSH_BASE_URL: ${{ secrets.OSH_BASE_URL }}
          OSH_TOKEN: ${{ secrets.OSH_TOKEN }}
          ISTSOS_BASE_URL: ${{ secrets.ISTSOS_BASE_URL }}
          ISTSOS_API_KEY: ${{ secrets.ISTSOS_API_KEY }}
          OGC_REFERENCE_URL: ${{ secrets.OGC_REFERENCE_URL }}
        run: npm run test:live -- --server=${{ matrix.server }}

      - name: Upload test results
        if: always()
        uses: actions/upload-artifact@v3
        with:
          name: live-test-results-${{ matrix.server }}
          path: test-results/live-integration/

      - name: Stop test servers
        if: always() && matrix.server == 'opensensorhub'
        run: docker-compose -f docker-compose.test.yml down
```
10. Documentation
Document live testing setup and results:
````markdown
# docs/LIVE-INTEGRATION-TESTING.md

# Live Server Integration Testing

This document describes how to run live integration tests against real OGC API - Connected Systems servers.

## Tested Servers

- **OpenSensorHub** - Open-source CSAPI server (Part 1 + Part 2)
- **istSOS** - SOS to CSAPI server (Part 1)
- **OGC Reference** - Reference implementation (Part 1 + Part 2)

## Setup

### 1. Configure Test Servers

Create `.env.test` file:

```bash
OSH_BASE_URL=https://opensensorhub.example.com/csapi
OSH_TOKEN=your_bearer_token
ISTSOS_BASE_URL=https://istsos.example.com/csapi
ISTSOS_API_KEY=your_api_key
OGC_REFERENCE_URL=https://ogc-reference.example.com/csapi
```

### 2. Run Local Test Servers (Optional)

```bash
docker-compose -f docker-compose.test.yml up -d
```

### 3. Run Tests

```bash
# All servers
npm run test:live

# Specific server
npm run test:live -- --server=opensensorhub

# Specific test file
npm run test:live -- tests/live-integration/crud.spec.ts
```

## Test Coverage

- **Discovery**: Landing page, conformance, collections
- **CRUD**: Create, read, update, delete operations
- **Queries**: All query parameters (bbox, datetime, q, etc.)
````
Issue prevention: Discovers problems before production deployments
Competitive advantage: Live testing demonstrates maturity and reliability
Impact if Not Addressed:
Unknown interoperability: Library may fail with certain servers despite theoretical compliance
Production surprises: First real-world usage may reveal unexpected issues
Vendor lock-in risk: May only work with specific server implementations
Authentication problems: OAuth2/API key flows may not work as expected
Performance issues: Real server latency may expose problems
User frustration: Issues discovered by users rather than developers
Reputation risk: Failures in production may damage library credibility
When to Prioritize:
User reports interoperability issues: Prioritize immediately if real server problems arise
Before 1.0 release: Include live testing results in 1.0 documentation
Enterprise customers: Required for enterprise adoption (must work with their servers)
Production deployments: Before deploying to production with real servers
OGC certification: If seeking official OGC compliance certification
ROI Assessment:
High for production users: Prevents costly production incidents
High for library credibility: Demonstrates real-world validation
Medium for open source: Shows commitment to quality and standards
Low for prototypes: Overkill for proof-of-concept projects
Best for: Production deployments, enterprise customers, OGC certification
Quick Win Opportunities:
Start with Phases 1-2 (infrastructure + discovery), roughly 12-18 hours of work
Provides immediate value: confirms library connects to real servers
Can expand to other phases incrementally as needed
Use Docker Compose for reliable local testing (no external dependencies)
Recommended Approach:
Implement Phases 1-2 (infrastructure + discovery) now for quick wins (12-18 hours)
Defer Phases 3-8 (comprehensive tests) until user demand or production needs
Implement Phase 9 (CI/CD) when running tests regularly
Implement Phase 10 (documentation) when results are stable
Final Note:
This is the FINAL work item (46 of 46) in the comprehensive validation project. Completing this item would provide 100% coverage of all identified validation work items, establishing the OGC-Client-CSAPI library as a fully validated, production-ready, enterprise-grade implementation of the OGC API - Connected Systems standard.
What is NOT Tested ❌:
Current Integration Test Limitations (from Issue #18):
The existing `endpoint.integration.spec.ts` (10 tests, 100% coverage) only tests:
Missing from existing tests:
Risk Assessment:
Without live server integration testing:
Real-World Impact:
From Issue #20 validation findings:
Key quote from Issue #20:
However, this verdict is based on code review only, not live server testing. The library achieves 98% theoretical compliance but has zero empirical validation against real OGC CSAPI servers.
Related Validation Issues:
Work Item ID: 46 from Remaining Work Items
Repository: https://github.com/OS4CSAPI/ogc-client-CSAPI
Validated Commit:
`a71706b9592cad7a5ad06e6cf8ddc41fa5387732`
Note: This is the FINAL work item (46 of 46) - completes the comprehensive validation project!
Detailed Findings
From Issue #20 (OGC Standards Compliance)
The OGC compliance validation achieved 98% compliance but explicitly identified the need for live server testing:
Validation Tasks Section (Lines from Issue #20):
Task #6: Test Against Reference Implementation
Status: ❌ NOT COMPLETED - No live server testing was conducted during validation
Why This Matters:
The validation report states:
However, this assessment is based on:
`navigator.ts` (2,091 lines)
Theoretical vs. Practical Compliance:
Theoretical Compliance (Verified ✅):
Practical Compliance (Not Verified ❌):
Existing Integration Tests (Issue #18)
Current Coverage (100% of internal integration):
What these tests do: Verify URL building and collection parsing with mock data
What these tests DON'T do:
Interoperability Concerns from Issue #20
Known OGC CSAPI Server Implementations:
OpenSensorHub - Open-source OGC CSAPI server
istSOS - Open-source SOS server with CSAPI support
Custom implementations - Organizations building their own servers
Interoperability Risks:
Without testing against multiple servers:
OGC Conformance Class Verification
From Issue #20, the library checks conformance classes:
1. Test Environment Setup
Target Servers for Testing:
Environment Variables:
```bash
# .env.test
OSH_BASE_URL=https://opensensorhub.example.com/csapi
OSH_TOKEN=your_bearer_token_here
ISTSOS_BASE_URL=https://istsos.example.com/csapi
ISTSOS_API_KEY=your_api_key_here
OGC_REFERENCE_URL=https://ogc-reference.example.com/csapi
```

Docker Compose for Local Testing:
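The compose file itself is not shown in the issue; a minimal sketch, assuming an OpenSensorHub image named `opensensorhub/osh-node` and port 8181 (both assumptions; 8181 matches the health-check URL used in the CI workflow):

```yaml
# docker-compose.test.yml (sketch; image name and tag are assumptions)
services:
  opensensorhub:
    image: opensensorhub/osh-node:latest   # hypothetical image name
    ports:
      - '8181:8181'   # matches the CI health-check URL
    healthcheck:
      test: ['CMD', 'curl', '-f', 'http://localhost:8181/csapi']
      interval: 5s
      timeout: 3s
      retries: 12
```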
Results
Test results are stored in `test-results/live-integration/`:
{server}-discovery.json - Server discovery results
{server}-crud.json - CRUD operation results
{server}-queries.json - Query parameter results
{server}-performance.json - Performance metrics
Known Issues
Interoperability Matrix
```
tests/
  live-integration/
    config.ts           (~150 lines) - Server configuration
    discovery.spec.ts   (~200 lines) - Discovery and conformance
    crud.spec.ts        (~400 lines) - CRUD operations
    queries.spec.ts     (~350 lines) - Query parameter testing
    formats.spec.ts     (~150 lines) - Format negotiation
    errors.spec.ts      (~200 lines) - Error handling
    performance.spec.ts (~250 lines) - Performance testing
    part2.spec.ts       (~300 lines) - Part 2 features
    helpers/
      test-utils.ts     (~200 lines) - Test utilities
      auth.ts           (~100 lines) - Authentication helpers
      cleanup.ts        (~100 lines) - Resource cleanup
docker-compose.test.yml (~100 lines) - Local test servers
.env.test.example       (~20 lines)  - Environment template
.github/
  workflows/
    live-integration.yml (~150 lines) - CI/CD workflow
docs/
  LIVE-INTEGRATION-TESTING.md (~500 lines) - Documentation
```
Package.json Scripts
```json
{
  "scripts": {
    "test:live": "jest --testMatch='**/live-integration/**/*.spec.ts' --runInBand",
    "test:live:docker": "docker-compose -f docker-compose.test.yml up -d && npm run test:live && docker-compose -f docker-compose.test.yml down"
  }
}
```

Implementation Phases
Phase 1: Infrastructure (8-12 hours)
Phase 2: Discovery Tests (4-6 hours)
Phase 3: CRUD Tests (10-14 hours)
Phase 4: Query Tests (8-12 hours)
Phase 5: Format Tests (4-6 hours)
Phase 6: Error Tests (6-8 hours)
Phase 7: Performance Tests (8-10 hours)
Phase 8: Part 2 Tests (8-12 hours)
Phase 9: CI/CD Integration (6-8 hours)
Phase 10: Documentation (8-12 hours)
`LIVE-INTEGRATION-TESTING.md`
Total Estimated Effort: 70-100 hours (1.75-2.5 weeks)
Known Challenges
1. Server Availability:
2. Authentication:
3. Data Persistence:
`afterAll()` hooks
4. Server Differences:
5. Network Flakiness:
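The flakiness challenge is usually handled with bounded retries and exponential backoff. A generic sketch follows; `retryWithBackoff` is a hypothetical test utility (e.g. a candidate for the planned `helpers/test-utils.ts`), not existing library code.

```typescript
// Hypothetical retry helper for flaky live-server calls:
// bounded attempts with exponential backoff between them.
async function retryWithBackoff<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 200
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Wait 200ms, 400ms, 800ms, ... before the next attempt.
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise(resolve => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}

// Example: a call that fails twice with a transient error, then succeeds.
let calls = 0;
const result = await retryWithBackoff(async () => {
  calls++;
  if (calls < 3) throw new Error('transient network error');
  return 'ok';
});
console.log(result, calls); // → ok 3
```

Wrapping each live-test request this way keeps a single dropped packet from failing a whole CI run, while the attempt bound still surfaces genuinely broken servers.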
Priority Justification
Priority: Low
Justification:
Why Low Priority:
Why Still Important: