Consumer API
The BadgerFy.ai Consumer API is designed to provide programmatic access to key platform functionalities, primarily focusing on automating data source uploads. This allows developers to build custom integrations and maintain up-to-date knowledge bases for their AI agents.
The Consumer API is available on Pro and Business plans only. Basic plan subscribers can upgrade to access automated data source updates via API.
Never reveal your API keys in your client-side code (frontend) or commit them directly to your codebase. Treat them as highly sensitive credentials. Always store them as secure environment variables and implement a practice of rotating these API keys regularly.
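As a minimal sketch of this practice, the helper below reads the key from an environment variable at startup and fails loudly if it is missing. The variable name BADGERFY_API_KEY is an assumption for illustration, not an official convention:

```python
import os

def get_api_key() -> str:
    # BADGERFY_API_KEY is a hypothetical variable name; use whatever your
    # deployment environment (CI secrets, .env loader, etc.) provides.
    key = os.environ.get("BADGERFY_API_KEY")
    if not key:
        raise RuntimeError("BADGERFY_API_KEY is not set; refusing to start")
    return key
```

Failing at startup rather than at the first API call makes key rotation mistakes visible immediately.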
Available Endpoints
The Consumer API provides the following endpoints for managing data source files:
File Endpoints
- GET /datasources/{datasourceId}/files - List all files in a datasource
- GET /datasources/{datasourceId}/files/by-name/{name} - Get a file by its reference name
- POST /datasources/{datasourceId}/files - Upload a new file
- PUT /datasources/{datasourceId}/files/{fileId} - Update an existing file
- DELETE /datasources/{datasourceId}/files/{fileId} - Delete a file
Order Data Endpoints
For datasources configured with format order-data (used with Recommendation Strips):
- POST /datasources/{datasourceId}/orders - Create order data
- PUT /datasources/{datasourceId}/orders/{fileId} - Update order data
Product-Promo Endpoints
For datasources configured with format product-promo (used with Nudge and Exit-Offer agents):
- POST /datasources/{datasourceId}/product-promo - Create product-promo data
- PUT /datasources/{datasourceId}/product-promo/{fileId} - Update product-promo data
Job Endpoints
- GET /jobs/{jobId} - Check processing job status
- GET /datasources/{datasourceId}/jobs - List all jobs for a datasource
Creating a New Data Source File
To upload a new file to your data source, use the POST endpoint:
- Upload the File: Send a POST request to /datasources/{datasourceId}/files with your file and a unique reference name.
- Name Tokenization: The name you provide will be automatically tokenized to be URL-safe (e.g., "My Product Catalog 2024" becomes "my-product-catalog-2024").
- Track Processing: The response includes a jobId. Use GET /jobs/{jobId} to poll for status.
- Completion: When the job status changes to completed, the response includes the dataSourceFileId and name of the created file. The file is immediately available to your AI agents.
Store your file's reference name in your configuration. You can later use GET /datasources/{datasourceId}/files/by-name/{name} to retrieve the file ID without having to list all files.
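The exact tokenization rules are applied server-side. The helper below is a rough client-side approximation (lowercase, runs of non-alphanumeric characters collapsed to single hyphens) that is useful for predicting the reference name you will later pass to the by-name lookup; it is a sketch, not the platform's canonical implementation:

```python
import re

def tokenize_name(name: str) -> str:
    # Approximation of the server-side tokenization described above:
    # lowercase the name, collapse non-alphanumeric runs to hyphens,
    # and trim leading/trailing hyphens.
    token = re.sub(r"[^a-z0-9]+", "-", name.lower())
    return token.strip("-")

print(tokenize_name("My Product Catalog 2024"))  # my-product-catalog-2024
```

If your names contain characters beyond spaces and punctuation, verify the server's result via the by-name endpoint rather than relying on this approximation.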
Updating an Existing Data Source File
To replace an existing file with a new version, use the PUT endpoint. This is the recommended approach for keeping your data sources synchronized:
- Get the File ID: Use GET /datasources/{datasourceId}/files/by-name/{name} to retrieve the current file's ID using its reference name.
- Upload the Replacement: Send a PUT request to /datasources/{datasourceId}/files/{fileId} with your new file.
- Automatic Handling: The system will:
  - Process the new file
  - Keep the original file active during processing (zero downtime)
  - Only delete the old file after the new one is successfully processed
  - Preserve the original reference name
- Track Processing: Use the returned jobId to monitor the update progress.
The update endpoint ensures zero downtime during file replacements. Your AI agents continue using the existing data until the new version is fully processed and ready.
Order Data for Recommendation Strips
If you're using Recommendation Strip agents, you can upload order/purchase history data directly via the API. Order data is processed differently from regular files—it bypasses embedding generation and is stored as structured JSON for fast product lookups.
Order data can only be uploaded to datasources created with useOrderData: true. Regular file uploads are blocked on order data datasources, and vice versa.
Order Data Format
Send a JSON body with name (reference name) and orders array:
{
  "name": "order-history-2024",
  "orders": [
    {
      "orderId": "ORD-12345",
      "orderDate": "2024-01-15",
      "metadata": { "source": "api" },
      "products": [
        {
          "displayName": "Wireless Mouse",
          "productPath": "/products/wireless-mouse",
          "category": "Electronics",
          "productCode": "WM-001",
          "sku": "SKU-12345",
          "thumbUrl": "https://example.com/images/mouse.jpg",
          "purchasePrice": "29.99"
        }
      ]
    }
  ]
}
Creating Order Data
Send a POST request to /datasources/{datasourceId}/orders:
- Include both name and orders in the JSON body
- Each order must have orderId, orderDate, and at least one product
- Each product requires: displayName, productPath, category, productCode, sku, thumbUrl, and purchasePrice
- The response includes a jobId for tracking and validation stats
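The required fields listed above can be checked locally before uploading, which gives faster feedback than waiting for server-side validation stats. The validator below is an unofficial sketch that simply mirrors the rules documented here:

```python
# Required fields for each product, per the order-data documentation.
REQUIRED_PRODUCT_FIELDS = {
    "displayName", "productPath", "category",
    "productCode", "sku", "thumbUrl", "purchasePrice",
}

def validate_orders_payload(payload: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the payload looks OK."""
    errors = []
    if not payload.get("name"):
        errors.append("missing top-level 'name'")
    orders = payload.get("orders") or []
    if not orders:
        errors.append("'orders' must contain at least one order")
    for i, order in enumerate(orders):
        for field in ("orderId", "orderDate"):
            if not order.get(field):
                errors.append(f"orders[{i}] missing '{field}'")
        products = order.get("products") or []
        if not products:
            errors.append(f"orders[{i}] must contain at least one product")
        for j, product in enumerate(products):
            missing = REQUIRED_PRODUCT_FIELDS - product.keys()
            if missing:
                errors.append(f"orders[{i}].products[{j}] missing {sorted(missing)}")
    return errors
```

Running this in your sync script before the POST avoids burning your one-job-at-a-time slot on a payload that would be rejected anyway.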
Updating Order Data
To replace existing order data, use PUT /datasources/{datasourceId}/orders/{fileId}:
- Get the file ID using GET /datasources/{datasourceId}/files/by-name/{name}
- Send the updated orders array in the request body
- The original data remains active until the new data is processed (zero downtime)
Order data processing is significantly faster than regular file uploads since it skips embedding generation. Your recommendation strips can access the new data within seconds of upload completion.
Product-Promo Data for Nudge & Exit-Offer Agents
If you're using Nudge or Exit-Offer agents, you can upload product and promotion data directly via the API. This data powers personalized promotional messages and exit intent offers.
Product-promo data can only be uploaded to datasources created with format product-promo. Regular file uploads are blocked on product-promo datasources, and vice versa.
The API accepts a single pre-formatted JSON file at a time. For uploading multiple files with AI-powered field mapping (CSV, JSON from different sources like Shopify exports), use the BadgerFy.ai dashboard to upload and process files.
Product-Promo Data Format
Send a JSON body with name (reference name) and records array. Each record represents one product with a variants array (at least one variant per product). Do not flatten variants into separate records—all SKUs for a product go in that product's variants array.
{
  "name": "summer-sale-promos",
  "records": [
    {
      "productPath": "/products/wireless-headphones",
      "productName": "Premium Wireless Headphones",
      "category": "Electronics",
      "productCode": "WH-1000",
      "highPrice": "149.99",
      "lowPrice": "149.99",
      "productThumbUrl": "https://example.com/images/headphones.jpg",
      "metadata": { "brand": "TechAudio" },
      "variants": [
        {
          "variantName": "Premium Wireless Headphones - Black",
          "variantPath": "/products/wireless-headphones?variant=blk",
          "variantPrice": "149.99",
          "sku": "WH-1000-BLK"
        }
      ],
      "promotions": [
        {
          "promotionType": "percent",
          "promotionValue": 20,
          "promotionText": "Summer Sale - 20% Off!",
          "discountPrice": "119.99",
          "discountCode": "SUMMER20",
          "ctaText": "Shop Now",
          "ctaLink": "/products/wireless-headphones?promo=summer20",
          "validFrom": "2024-06-01T00:00:00Z",
          "validUntil": "2024-08-31T23:59:59Z"
        }
      ]
    },
    {
      "productPath": "/products/running-shoes",
      "productName": "Ultra Comfort Running Shoes",
      "category": "Footwear",
      "productCode": "RS-500",
      "highPrice": "89.99",
      "lowPrice": "89.99",
      "variants": [
        {
          "variantName": "Ultra Comfort Running Shoes - Size 10",
          "variantPath": "/products/running-shoes",
          "variantPrice": "89.99",
          "sku": "RS-500-10"
        }
      ],
      "promotions": [
        {
          "promotionType": "dollar",
          "promotionValue": 15,
          "promotionText": "$15 Off All Running Shoes",
          "discountPrice": "74.99"
        },
        {
          "promotionType": "shipping",
          "promotionText": "Free Shipping on Orders $50+",
          "ctaLink": "/shipping-policy",
          "ctaText": "Learn More"
        }
      ]
    },
    {
      "productPath": "/products/fitness-bundle",
      "productName": "Complete Fitness Bundle",
      "category": "Fitness",
      "productCode": "FB-PRO-001",
      "highPrice": "299.99",
      "lowPrice": "299.99",
      "productThumbUrl": "https://example.com/images/fitness-bundle.jpg",
      "variants": [
        {
          "variantName": "Complete Fitness Bundle",
          "variantPath": "/products/fitness-bundle",
          "variantPrice": "299.99",
          "sku": "FB-PRO-001"
        }
      ],
      "promotions": [
        {
          "promotionType": "bundle",
          "promotionValue": 50,
          "promotionText": "Save $50 when you buy the complete set!",
          "discountPrice": "249.99",
          "ctaText": "Get Bundle Deal",
          "ctaLink": "/products/fitness-bundle#bundle-offer"
        }
      ]
    }
  ]
}
Product Record Fields
Required fields (product level):
- productPath - URL path to product page (e.g., "/products/wireless-headphones", "/product/my-item")
- productName - Base product name (no variant attributes)
- category - Product category for grouping and filtering
- productCode - Parent/main SKU identifier tying variants together
- highPrice - Highest variant price as a string (e.g., "149.99")
- lowPrice - Lowest variant price as a string (e.g., "99.99")
- variants - Array of at least one variant object (see below)
Required fields (each variant in variants):
- variantName - Product name plus variant attributes (e.g., "Headphones - Black / Large")
- variantPath - Full path to this variant (e.g., product path + query params)
- variantPrice - Price for this variant as a string (e.g., "99.99")
- sku - Variant SKU identifier
Optional fields (product level):
- productThumbUrl - URL to product thumbnail image
- description - Short product description (plain text, up to 300 characters)
- promotions - Array of promotion objects
- metadata - Optional object for custom fields (brand, tags, etc.)
Optional fields (variant level):
- variantThumbUrl - Thumbnail URL for this variant
- metadata - Variant-specific options (e.g., color, size)
Promotion Object Fields
Each promotion object describes one offer for its product. All promotion fields are optional, but you should include enough information for meaningful display:
- promotionType - Type of promotion. Valid values: percent, dollar, shipping, bundle, bogo, quantity, gift, rebate, coupon, sale, clearance, flash, other
- promotionValue - Numeric discount value (e.g., 20 for 20% off or $20 off depending on type)
- promotionText - Human-readable promotion message shown to users
- discountPrice - Final price after discount as a string (e.g., "79.99")
- discountCode - Coupon code the customer can use at checkout
- ctaLink - Call-to-action link URL (e.g., product page, promo landing page)
- ctaText - Call-to-action button text (e.g., "Shop Now", "Learn More")
- validFrom - Promotion start date in ISO 8601 format
- validUntil - Promotion end date in ISO 8601 format
A single product can have multiple promotions. For example, a product might have both a percentage discount and free shipping. The Nudge/Exit-Offer agent will intelligently select the most relevant promotion to display based on context.
Creating Product-Promo Data
Send a POST request to /datasources/{datasourceId}/product-promo:
- Include both name and records in the JSON body
- Each record must have productPath, productName, category, productCode, highPrice, lowPrice, and a variants array with at least one variant (each variant needs variantName, variantPath, variantPrice, sku)
- The response includes the dataSourceFileId and recordCount
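As with order data, the record-level requirements above can be checked before uploading. The snippet below is an unofficial local sketch of those rules, including the "at least one variant per record" constraint:

```python
# Required fields per the product-promo documentation.
REQUIRED_RECORD_FIELDS = {
    "productPath", "productName", "category",
    "productCode", "highPrice", "lowPrice",
}
REQUIRED_VARIANT_FIELDS = {"variantName", "variantPath", "variantPrice", "sku"}

def validate_promo_records(records: list[dict]) -> list[str]:
    """Return a list of validation errors; empty means the records look OK."""
    errors = []
    for i, record in enumerate(records):
        missing = REQUIRED_RECORD_FIELDS - record.keys()
        if missing:
            errors.append(f"records[{i}] missing {sorted(missing)}")
        variants = record.get("variants") or []
        if not variants:
            errors.append(f"records[{i}] needs at least one variant")
        for j, variant in enumerate(variants):
            missing_v = REQUIRED_VARIANT_FIELDS - variant.keys()
            if missing_v:
                errors.append(f"records[{i}].variants[{j}] missing {sorted(missing_v)}")
    return errors
```

This also catches the common mistake of flattening SKUs into separate records, since a record carrying only variant-level fields will fail the record-level check.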
Updating Product-Promo Data
To replace existing product-promo data, use PUT /datasources/{datasourceId}/product-promo/{fileId}:
- Get the file ID using GET /datasources/{datasourceId}/files/by-name/{name}
- Send the updated records array in the request body
- All existing records are replaced with the new records
Product-promo data is stored directly without queue processing, making updates nearly instantaneous. Your Nudge and Exit-Offer agents immediately see the new promotions.
Example Automation Script Flow
Here's a typical automation flow for keeping your data synchronized:
- Configuration: Store your datasource ID and file reference name in environment variables or a config file.
- Check for Existing File: Call GET /datasources/{datasourceId}/files/by-name/{name}.
- Create or Update:
  - If file exists → Use PUT to update it
  - If file doesn't exist → Use POST to create it
- Poll for Completion: Check GET /jobs/{jobId} until the job completes or fails.
- Schedule: Run this on a cron schedule (e.g., daily) to keep your AI agents up-to-date.
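The flow above can be sketched as follows. The http callable is injected so the logic can be tested without network access; the response shapes (jobId, status, dataSourceFileId) follow the fields mentioned in this document, but the transport layer, auth, and error handling are assumptions, not a verified client:

```python
import time

def sync_file(http, datasource_id: str, name: str, file_bytes: bytes,
              poll_interval: float = 2.0, timeout: float = 300.0) -> dict:
    """Create-or-update flow: PUT if the reference name exists, else POST.

    `http(method, path, **kwargs)` is an injected callable that returns the
    decoded JSON response as a dict, returns None on 404, and raises for
    other HTTP errors.
    """
    # Step 1: look up the file by its reference name.
    existing = http("GET", f"/datasources/{datasource_id}/files/by-name/{name}")

    # Step 2: update in place if it exists, otherwise create it.
    if existing:
        job = http("PUT",
                   f"/datasources/{datasource_id}/files/{existing['dataSourceFileId']}",
                   file=file_bytes)
    else:
        job = http("POST", f"/datasources/{datasource_id}/files",
                   file=file_bytes, name=name)

    # Step 3: poll the job until it completes or fails.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = http("GET", f"/jobs/{job['jobId']}")
        if status["status"] in ("completed", "failed"):
            return status
        time.sleep(poll_interval)
    raise TimeoutError(f"job {job['jobId']} did not finish within {timeout}s")
```

Injecting the transport also makes it easy to add the pre-upload jobs check described under API Usage Guidelines before calling this function.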
API Usage Guidelines
- Upload Limits: The same upload limits (e.g., 10MB per file) apply to API uploads as they do to manual uploads through the dashboard.
- One Job at a Time: Only one file can be processed per datasource at a time. Check GET /datasources/{datasourceId}/jobs before uploading to ensure no jobs are in progress.
- Scraped Files: Files created via website scraping cannot be updated via the API. Delete and re-scrape using the dashboard instead.
- Management: You can manage and view all uploaded files and data sources on the BadgerFy.ai data source page in your dashboard.
- No Website Scraping API: Currently, we do not offer an API endpoint for programmatically initiating website scraping jobs.
API Reference
For detailed information on all available endpoints, request/response formats, and authentication methods, please refer to our Swagger documentation: