# Changelog

All notable changes to TypedFetch will be documented in this file.

## [0.2.0] - 2025-01-20

### Added

- **SSE Streaming Support**: native Server-Sent Events via the `streamSSE()` method (a parsing sketch follows this list)
- **POST Body Support for Streaming**: all streaming methods (`stream()`, `streamJSON()`, `streamSSE()`) now accept request bodies, as AI/ML workloads require
- **Race Method**: `race()` returns the first successful response from multiple endpoints
- **Batch Processing**: `batch()` runs bulk operations efficiently with concurrency control
- **Parallel Requests**: `parallel()` uses Web Workers for true parallelism
- **Resumable Uploads**: `uploadResumable()` with adaptive chunking and progress tracking
- **Streaming Uploads**: `uploadStream()` for large file and model uploads
- **Bandwidth Throttling**: `throttled()` limits bandwidth with a token bucket algorithm
- **Auto-Reconnecting Streams**: `streamWithReconnect()` for resilient streaming with automatic reconnection
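
As a point of reference, the sketch below shows what consuming an SSE response over POST looks like with plain web APIs. It is a simplified, generic illustration (it ignores `event:`/`id:` fields and `\r\n` line endings), not TypedFetch's implementation, and the endpoint and payload are placeholders.

```typescript
// Generic SSE-over-POST consumer using only standard web APIs.
// Illustrative only: this is not TypedFetch's code; the URL and payload are placeholders.
async function* sseEvents(url: string, body: unknown): AsyncGenerator<string> {
  const response = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json", Accept: "text/event-stream" },
    body: JSON.stringify(body),
  });
  if (!response.ok || !response.body) {
    throw new Error(`SSE request failed with status ${response.status}`);
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Events are separated by a blank line; each event may carry several "data:" lines.
    let separator: number;
    while ((separator = buffer.indexOf("\n\n")) !== -1) {
      const rawEvent = buffer.slice(0, separator);
      buffer = buffer.slice(separator + 2);
      const data = rawEvent
        .split("\n")
        .filter((line) => line.startsWith("data:"))
        .map((line) => line.slice(5).trim())
        .join("\n");
      if (data) yield data;
    }
  }
}

// Usage (hypothetical endpoint):
// for await (const chunk of sseEvents("https://api.example.com/v1/chat", { prompt: "hi" })) {
//   console.log(chunk);
// }
```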

### Changed

- Bundle size increased to 11 KB gzipped (from 8 KB), still under the 15 KB target
- Improved TypeScript types for all new methods
- Enhanced error handling for streaming operations

### Removed

- **Breaking:** removed the duplicate `withExponentialBackoff()` helper; use the built-in retry configuration instead
- **Breaking:** removed the AI-specific `trackTokenUsage()` method, which was too specialized for a general-purpose HTTP client
- **Breaking:** removed `parallelInference()`; use the generic `parallel()` method instead (a concurrency-limited batching sketch follows this list)
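
For anyone migrating off `parallelInference()`, the general pattern behind `batch()`/`parallel()`-style helpers is a concurrency-limited task runner. The sketch below is a standalone illustration of that pattern, not TypedFetch's internal code, and the limit of 4 and the endpoint are arbitrary examples.

```typescript
// Run async tasks with a fixed concurrency limit: a generic stand-in for
// what batch()/parallel()-style helpers do. Not TypedFetch's implementation.
async function runWithConcurrency<T>(
  tasks: Array<() => Promise<T>>,
  limit: number,
): Promise<T[]> {
  const results = new Array<T>(tasks.length);
  let next = 0;

  // Each worker pulls the next unclaimed task until none remain.
  async function worker(): Promise<void> {
    while (next < tasks.length) {
      const index = next++;
      results[index] = await tasks[index]();
    }
  }

  const workers = Array.from({ length: Math.min(limit, tasks.length) }, worker);
  await Promise.all(workers);
  return results;
}

// Usage (hypothetical endpoint):
// const pages = await runWithConcurrency(
//   ids.map((id) => () => fetch(`https://api.example.com/items/${id}`).then((r) => r.json())),
//   4,
// );
```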

### Fixed

- Streaming methods now resolve relative URLs against the configured `baseURL`
- TypeScript strict mode compatibility issues

## [0.1.3] - 2025-01-19

### Fixed

- Streaming methods (`stream()` and `streamJSON()`) now honor `baseURL` for relative URLs (see the URL-resolution note after this list)
- "Failed to parse URL" errors when streaming from configured instances

## [0.1.2] - 2025-01-18

### Added

- Initial streaming support with the `stream()` and `streamJSON()` methods
- Circuit breaker pattern for fault tolerance
- Request deduplication (a generic sketch of the pattern follows this list)
- W-TinyLFU caching algorithm
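
Request deduplication typically means coalescing identical in-flight requests onto a single promise. The sketch below shows that general pattern with plain `fetch`; it is illustrative only and keys requests by URL alone, whereas a real client would also account for method, headers, and body.

```typescript
// Coalesce identical in-flight GET requests onto a single promise.
// Generic illustration of request deduplication, not TypedFetch's code.
const inFlight = new Map<string, Promise<unknown>>();

function dedupedGetJson<T>(url: string): Promise<T> {
  const pending = inFlight.get(url);
  if (pending) return pending as Promise<T>;

  const request = fetch(url)
    .then((response) => {
      if (!response.ok) throw new Error(`HTTP ${response.status}`);
      return response.json() as Promise<T>;
    })
    .finally(() => inFlight.delete(url)); // allow fresh requests once settled

  inFlight.set(url, request);
  return request;
}

// Two concurrent calls for the same URL share one network request:
// const [a, b] = await Promise.all([
//   dedupedGetJson("https://api.example.com/config"),
//   dedupedGetJson("https://api.example.com/config"),
// ]);
```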

## [0.1.1] - 2025-01-17

### Added

- Basic retry logic with exponential backoff (see the backoff sketch after this list)
- Request/response interceptors
- Cache management
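
Exponential backoff retries a failed call with a delay that doubles on each attempt, usually with random jitter to avoid synchronized retries. The sketch below is a self-contained illustration of the pattern, not TypedFetch's internals; the attempt count and base delay are arbitrary.

```typescript
// Retry an async operation with exponential backoff plus random jitter.
// Generic illustration of the pattern, not TypedFetch's implementation.
async function retryWithBackoff<T>(
  operation: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 250,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await operation();
    } catch (error) {
      lastError = error;
      if (attempt === attempts - 1) break;
      // Delay doubles each attempt: 250 ms, 500 ms, 1000 ms, ... plus up to 100 ms of jitter.
      const delay = baseDelayMs * 2 ** attempt + Math.random() * 100;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}

// Usage (hypothetical endpoint):
// const data = await retryWithBackoff(() =>
//   fetch("https://api.example.com/health").then((r) => r.json()),
// );
```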

## [0.1.0] - 2025-01-16

### Added

- Initial release
- Type-safe HTTP client with a Proxy-based API (a toy version of the idea follows this list)
- Zero dependencies
- Support for all HTTP methods
- Automatic JSON parsing
- Error handling with detailed context
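
A Proxy-based client traps property access so that chained properties become URL path segments and a trailing call issues the request. The toy sketch below only illustrates that idea; TypedFetch's actual surface and type machinery are richer, and the base URL is a placeholder.

```typescript
// Toy Proxy-based client: property access builds a path, calling it issues a GET.
// This illustrates the general idea behind a Proxy-based API; it is not TypedFetch.
type PathClient = { [segment: string]: PathClient } & (() => Promise<unknown>);

function createClient(baseURL: string, segments: string[] = []): PathClient {
  const target = () =>
    fetch(new URL(segments.join("/"), baseURL)).then((response) => {
      if (!response.ok) throw new Error(`HTTP ${response.status}`);
      return response.json(); // automatic JSON parsing
    });

  return new Proxy(target, {
    get: (_obj, prop) => createClient(baseURL, [...segments, String(prop)]),
    apply: (fn) => fn(),
  }) as unknown as PathClient;
}

// Usage: api.users() issues GET https://api.example.com/users
// const api = createClient("https://api.example.com/");
// const users = await api.users();
```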