API v1 — Generally Available

API & Exports

Connect RollForge portfolio data to your data warehouse, BI tools, or workflow automation. Three channels: real-time webhooks, REST API, and scheduled CSV exports.

Authentication

All REST API requests require a Bearer token. Generate API keys from your Integrations dashboard. Keys are scoped to your organization — all data returned is org-scoped automatically.

HTTP Header
Authorization: Bearer rfk_live_<your_api_key>
curl example
curl -H "Authorization: Bearer rfk_live_abc123..." \
  https://rollforgeops.ai/api/v1/portcos
⚠️
Keys are shown once. Copy and store your API key securely when created — it will not be displayed again. Revoke compromised keys immediately from the integrations dashboard.

Rate Limits

1,000 requests per hour per API key. Limit resets at the top of each UTC hour.

When you exceed the limit, you receive a 429 Too Many Requests response. Back off and retry after the window resets.

# 429 response
{
  "error": "Rate limit exceeded: 1000 requests/hour"
}
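If you script against the API directly, a small client-side backoff helper keeps you within the limit. A minimal sketch using only the Python standard library (the helper names and the exponential schedule are illustrative, not part of an official SDK):

```python
import time
import urllib.error
import urllib.request

def backoff_delays(retries, base=1.0):
    """Exponential backoff schedule in seconds: 1, 2, 4, ..."""
    return [base * (2 ** i) for i in range(retries)]

def get_with_retry(url, api_key, retries=3):
    """GET an API v1 URL, retrying with backoff on 429 responses."""
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {api_key}"}
    )
    for delay in backoff_delays(retries):
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.read()
        except urllib.error.HTTPError as exc:
            if exc.code != 429:
                raise  # only retry rate-limit responses
            time.sleep(delay)
    raise RuntimeError("still rate-limited after retries")
```

Since the limit resets at the top of each UTC hour, a production client may prefer sleeping until the next hour boundary instead of a fixed schedule.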

Pagination

All list endpoints return paginated results. Pass page and per_page query parameters.

| Param | Default | Max | Description |
|---|---|---|---|
| page | 1 | n/a | Page number (1-indexed) |
| per_page | 50 | 200 | Results per page |
Response envelope
{
  "data": [...],
  "meta": {
    "page": 1,
    "per_page": 50,
    "total": 312,
    "pages": 7
  }
}
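The meta block makes it easy to walk every page. A minimal sketch (the fetch_all helper is illustrative; it takes any callable that performs the actual HTTP request and returns the parsed envelope):

```python
def fetch_all(fetch_page, per_page=200):
    """Collect every row from a paginated list endpoint.

    fetch_page(page=..., per_page=...) must return the parsed
    response envelope: {"data": [...], "meta": {...}}.
    """
    page, rows = 1, []
    while True:
        body = fetch_page(page=page, per_page=per_page)
        rows.extend(body["data"])
        if page >= body["meta"]["pages"]:
            return rows
        page += 1
```

Pairing it with your HTTP client of choice is then a one-liner: pass a callable that hits the list endpoint with the given query parameters and returns the decoded JSON.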

Portcos

Portfolio company records — one row per portco your firm manages.

GET /api/v1/portcos List all portcos
| Query Param | Type | Description |
|---|---|---|
| portco_id | integer | Filter to a single portco |
| since | ISO 8601 | Only rows updated after this timestamp |
| until | ISO 8601 | Only rows updated before this timestamp |
| page | integer | Page number (default 1) |
| per_page | integer | Results per page (max 200) |
curl
curl -H "Authorization: Bearer rfk_live_..." \
  "https://rollforgeops.ai/api/v1/portcos?per_page=100"
Response row
{
  "id": 1,
  "name": "Blue Ridge HVAC",
  "vertical": "HVAC",
  "geography": "Atlanta, GA",
  "tech_count": 18,
  "annual_revenue": 4200000,
  "ebitda": 630000,
  "close_date": "2024-03-15",
  "status": "active",
  "created_at": "2024-03-15T09:00:00Z",
  "updated_at": "2025-05-01T14:22:00Z"
}

KPIs

Vendor spend and operational KPI data per portco and category.

GET /api/v1/kpis List KPI data
curl
curl -H "Authorization: Bearer rfk_live_..." \
  "https://rollforgeops.ai/api/v1/kpis?portco_id=1"

Operating Scores

Composite 0–100 operating scores per portco, computed daily across 8 weighted modules: Pricing, Workforce, Memberships, Marketing, Conversion, Integration, Compliance, Financial.

GET /api/v1/scores List operating scores
curl
curl -H "Authorization: Bearer rfk_live_..." \
  "https://rollforgeops.ai/api/v1/scores?since=2025-01-01"
Response row
{
  "portco_id": 1,
  "portco_name": "Blue Ridge HVAC",
  "overall_score": 82.4,
  "grade": "B",
  "trend_30d": 3.1,
  "trend_90d": 7.8,
  "top_strength": "Strong close rate (68% vs 60% P50)",
  "top_risk": "3 expired licenses (compliance −22pts)",
  "computed_at": "2025-05-05T02:00:00Z"
}
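The trend fields make it straightforward to build a portfolio watchlist. A hypothetical sketch over rows shaped like the response above (the threshold is an arbitrary example, not a product default):

```python
def declining_portcos(score_rows, threshold=-5.0):
    """Names of portcos whose 30-day score trend fell below threshold."""
    return [
        row["portco_name"]
        for row in score_rows
        if row["trend_30d"] <= threshold
    ]
```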

Pipeline

Acquisition pipeline deals at all stages: sourced → screening → LOI → diligence → closing → closed_won / passed.

GET /api/v1/pipeline List pipeline deals
| Query Param | Type | Description |
|---|---|---|
| stage | string | Filter by stage: sourced, screening, loi, diligence, closing, closed_won, passed |
| since | ISO 8601 | Only deals updated after this timestamp |
| portco_id | integer | Filter by converted portco ID |
curl
curl -H "Authorization: Bearer rfk_live_..." \
  "https://rollforgeops.ai/api/v1/pipeline?stage=diligence"

Compliance

Credential records per portco — licenses, insurance policies, bonds, certifications. Filter by status to pull expiring or expired items.

GET /api/v1/compliance List compliance credentials
| Query Param | Description |
|---|---|
| portco_id | Filter to one portco |
| status | current, expiring_soon, expired, unknown |
| since / until | Updated-at range |
curl — pull all expired credentials
curl -H "Authorization: Bearer rfk_live_..." \
  "https://rollforgeops.ai/api/v1/compliance?status=expired"

Webhooks

Receive real-time events via HTTP POST to any HTTPS endpoint you control. Configure webhooks from the Webhooks dashboard.

💡
Self-service setup. Create a webhook endpoint, paste your URL, select events, and copy your signing secret. No engineering ticket required.

Event Reference

| Event | Triggered when |
|---|---|
| portco.created | New portfolio company added to your org |
| portco.updated | Portco record updated (revenue, headcount, status, etc.) |
| kpi.recomputed | KPI data refreshed for a portco |
| score.recomputed | Operating score recalculated for a portco |
| compliance.alert | A credential expires within 30 days or is expired |
| pipeline.stage_changed | A deal moves to a new stage in the acquisition pipeline |
| membership.churned | A service membership cancels or lapses at a portco |
Payload envelope
{
  "event": "score.recomputed",
  "occurred_at": "2025-05-05T14:22:00Z",
  "data": {
    // event-specific payload
  }
}
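One common consumer pattern is routing each envelope to a handler keyed on the event name. A minimal sketch (the handler wiring is illustrative):

```python
def dispatch(envelope, handlers):
    """Route a webhook envelope to the handler registered for its event.

    Unknown events are ignored, so newly added event types never
    break an existing consumer.
    """
    handler = handlers.get(envelope["event"])
    if handler is None:
        return None
    return handler(envelope["data"])
```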

Signature Verification

Every delivery includes an X-RollForge-Signature header. Verify it to reject forged requests.

Node.js — verify signature
const crypto = require('crypto');
const express = require('express');
const app = express();

function verifySignature(rawBody, signatureHeader, secret) {
  if (!signatureHeader) return false;
  const expected = 'sha256=' + crypto
    .createHmac('sha256', secret)
    .update(rawBody)
    .digest('hex');
  const expectedBuf = Buffer.from(expected);
  const headerBuf = Buffer.from(signatureHeader);
  // timingSafeEqual throws if lengths differ, so compare lengths first
  if (expectedBuf.length !== headerBuf.length) return false;
  return crypto.timingSafeEqual(expectedBuf, headerBuf);
}

// express.raw keeps the body as a Buffer so the HMAC matches byte-for-byte
app.post('/webhook', express.raw({type: 'application/json'}), (req, res) => {
  const sig = req.headers['x-rollforge-signature'];
  if (!verifySignature(req.body, sig, process.env.WEBHOOK_SECRET)) {
    return res.status(401).send('Invalid signature');
  }
  const event = JSON.parse(req.body);
  // process event...
  res.status(200).send('OK');
});
Python — verify signature
import hmac, hashlib

def verify_signature(raw_body: bytes, header: str, secret: str) -> bool:
    expected = 'sha256=' + hmac.new(
        secret.encode(), raw_body, hashlib.sha256
    ).hexdigest()
    return hmac.compare_digest(expected, header)
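To sanity-check a verifier locally before pointing RollForge at your endpoint, you can sign a sample payload yourself with the same HMAC construction and confirm the round trip (the sample secret here is made up):

```python
import hashlib
import hmac

def sign(raw_body: bytes, secret: str) -> str:
    """Compute the header value: sha256=<hex HMAC-SHA256 of the raw body>."""
    digest = hmac.new(secret.encode(), raw_body, hashlib.sha256).hexdigest()
    return "sha256=" + digest

def verify_signature(raw_body: bytes, header: str, secret: str) -> bool:
    """Constant-time comparison against the expected signature."""
    return hmac.compare_digest(sign(raw_body, secret), header)
```

A payload signed with your secret should verify, and any other secret should not; if both checks pass, the endpoint is ready for real deliveries.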

Retry Logic

Failed deliveries (non-2xx response or connection timeout) are retried automatically:

| Attempt | Delay |
|---|---|
| 1st retry | 1 minute |
| 2nd retry | 5 minutes |
| 3rd retry | 30 minutes |

After 3 failed attempts, the delivery is marked as failed. View the full delivery log in your Webhooks dashboard.

ℹ️
Respond fast. Your endpoint should return a 2xx response immediately and process the payload asynchronously. RollForge times out after 10 seconds.

Scheduled CSV Exports

Configure recurring dataset exports on an hourly, daily, or weekly schedule. Deliveries can go to Amazon S3 (via presigned PUT URL) or SFTP, or you can trigger an on-demand download. Configure from the Exports dashboard.

Datasets available: portcos, kpis, scores, pipeline, compliance, workforce, memberships.

Incremental sync: Enable to export only rows modified since the last successful run, using an updated_at watermark. Ideal for large portfolios with high-frequency KPI updates.
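On the warehouse side, an incremental file is a merge, not an append. A minimal sketch with SQLite standing in for the target table (the table name and column subset are illustrative; real exports carry the full dataset schemas):

```python
import csv
import io
import sqlite3

def upsert_portcos(conn, csv_text):
    """Merge an incremental portcos export into a local table, keyed on id."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS portcos ("
        "id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT)"
    )
    for row in csv.DictReader(io.StringIO(csv_text)):
        # Re-exported rows overwrite the prior version of the same id
        conn.execute(
            "INSERT INTO portcos (id, name, updated_at) VALUES (?, ?, ?) "
            "ON CONFLICT(id) DO UPDATE SET "
            "name = excluded.name, updated_at = excluded.updated_at",
            (int(row["id"]), row["name"], row["updated_at"]),
        )
    conn.commit()
```

The same upsert-on-primary-key shape applies to MERGE statements in Snowflake or BigQuery.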

Dataset Schemas

portcos

| Field | Type | Description |
|---|---|---|
| id | integer | Unique portco ID |
| name | text | Company name |
| vertical | text | HVAC, Plumbing, Electrical, Mixed |
| geography | text | City, state |
| tech_count | integer | Number of field technicians |
| annual_revenue | numeric | Annual revenue (USD) |
| ebitda | numeric | EBITDA (USD) |
| close_date | date | Acquisition close date |
| status | text | active, exited, hold |
| created_at | timestamptz | Record creation time (UTC) |
| updated_at | timestamptz | Last updated (UTC); used as the incremental watermark |

scores

| Field | Type | Description |
|---|---|---|
| portco_id | integer | FK to portcos.id |
| portco_name | text | Denormalized name |
| overall_score | numeric(5,1) | 0–100 composite score |
| grade | text | A, B, C, D, F |
| trend_30d | numeric | Score delta vs. 30 days ago (positive = improving) |
| trend_90d | numeric | Score delta vs. 90 days ago |
| top_strength | text | Human-readable top strength |
| top_risk | text | Human-readable top risk |
| computed_at | timestamptz | Score computation timestamp |

compliance

| Field | Type | Description |
|---|---|---|
| id | integer | Credential record ID |
| portco_id | integer | FK to portcos.id |
| portco_name | text | Denormalized portco name |
| credential_name | text | License or policy name |
| credential_type | text | state_license, insurance, bond, epa_cert, etc. |
| status | text | current, expiring_soon, expired, unknown |
| expiry_date | date | Expiration date |
| holder_name | text | License holder |
| jurisdiction | text | State or locality |
| license_number | text | License number |
| updated_at | timestamptz | Used for incremental watermark |

Snowflake

Use an External Stage pointing at your S3 bucket, then COPY INTO your target table on a schedule.

  1. In RollForge, create a daily CSV export for your dataset with destination = S3. Provide a presigned PUT URL for your S3 bucket.
  2. In Snowflake, create an external stage on the same S3 bucket.
  3. Schedule a Snowflake Task that runs COPY INTO after each export window.
Snowflake SQL — one-time setup
CREATE STAGE rollforge_stage
  URL = 's3://your-bucket/rollforge-exports/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...');

COPY INTO portcos_raw
  FROM @rollforge_stage/portcos_
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
  PATTERN = '.*portcos.*\.csv';
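Step 3 (the scheduled load) can be sketched as a Snowflake Task; the warehouse name and cron schedule below are placeholders to adapt to your export window:

```sql
-- Run the COPY after the daily export lands (06:00 UTC in this example)
CREATE TASK rollforge_load_portcos
  WAREHOUSE = my_wh
  SCHEDULE = 'USING CRON 0 6 * * * UTC'
AS
  COPY INTO portcos_raw
    FROM @rollforge_stage
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    PATTERN = '.*portcos.*\.csv';

-- Tasks are created suspended; resume to start the schedule
ALTER TASK rollforge_load_portcos RESUME;
```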

BigQuery

Use BigQuery Data Transfer with an S3 source, or load directly from Cloud Storage if you relay exports there.

  1. Configure a daily S3 export in RollForge.
  2. Set up a Cloud Storage sync from S3 (AWS CLI or Lambda) to a GCS bucket.
  3. Create a BigQuery Data Transfer job from the GCS bucket, scheduled to run daily.
bq load — ad-hoc
bq load --autodetect --source_format=CSV \
  myproject:rollforge.portcos \
  gs://your-bucket/rollforge-exports/portcos_*.csv

Fivetran

Use Fivetran's REST API connector to pull data from RollForge's REST API v1.

  1. In Fivetran, add a new connector → "REST API".
  2. Base URL: https://rollforgeops.ai/api/v1
  3. Authentication: Header → Authorization: Bearer rfk_live_...
  4. Configure endpoints: /portcos, /scores, /compliance, /pipeline
  5. Set sync frequency (hourly recommended). Fivetran handles pagination via the meta.page field automatically.
💡
Use the since query parameter for incremental syncs. Fivetran can pass the last synced timestamp via a cursor variable.

Airbyte

Use Airbyte's HTTP Source (Low-Code CDK or custom connector) to sync from the REST API.

  1. In Airbyte, add a new source → "HTTP API" (or build a custom connector).
  2. Configure the base URL and authorization header with your API key.
  3. Define streams for each endpoint: portcos, kpis, scores, pipeline, compliance.
  4. Each stream uses cursor-based incremental sync on the updated_at field.
Airbyte YAML stream definition (Low-Code CDK)
streams:
  - name: portcos
    primary_key: id
    cursor_field: updated_at
    retriever:
      requester:
        url_base: https://rollforgeops.ai
        path: /api/v1/portcos
        http_method: GET
        authenticator:
          type: ApiKeyAuthenticator
          header: Authorization
          api_token: "Bearer {{ config['api_key'] }}"
      paginator:
        type: DefaultPaginator
        page_size_option:
          inject_into: request_parameter
          field_name: per_page
        pagination_strategy:
          type: PageIncrement
          page_size: 200

Ready to connect your data warehouse?

Generate an API key and configure your first export in under 10 minutes.

Open Integrations Dashboard →