How to Sync QuickBooks Desktop with Salesforce

Learn how to sync QuickBooks Desktop or QuickBooks Enterprise with Salesforce using Conductor on the QuickBooks side and Salesforce's REST, Composite, and Bulk APIs on the CRM side.

If you want to sync QuickBooks Desktop with Salesforce, the cleanest approach is:

  • use Conductor for the QuickBooks Desktop side

  • use Salesforce APIs for the Salesforce side

  • keep your own backend as the mapping and orchestration layer

That is the practical architecture for most teams.

Quick answers

  • Can I sync QuickBooks Enterprise with Salesforce the same way? Yes. Enterprise uses the same core Desktop integration stack.

  • Which Salesforce APIs are most useful? Usually REST API first, Composite API for grouped writes, and Bulk API 2.0 for large backfills.

  • Should I build direct SOAP + qbXML integration to QuickBooks? Usually no, unless you intentionally want to own the legacy Desktop stack.

  • What is the most important design choice? Define record ownership and external IDs before you sync anything.

The overall pattern

At a high level, the sync loop looks like this:

  1. Read changes from Salesforce or QuickBooks Desktop.

  2. Normalize them into your app's integration model.

  3. Write the destination-side changes.

  4. Store both system IDs and sync state.

  5. Surface human-readable errors when the Desktop connection is not healthy.

That sounds obvious, but it is exactly where many integrations go wrong. Teams often focus on API calls and skip the durable mapping layer that prevents duplicate records and confusing partial syncs.
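
The loop above can be sketched as a small orchestration function. The `fetchChanges`, `writeToDestination`, and `saveSyncState` callbacks are hypothetical placeholders for your own API calls and persistence layer:

```typescript
// Hypothetical shape of one sync cycle; the callbacks stand in for real
// API calls and your database. Names here are illustrative, not a real SDK.
interface SyncRecord {
  sourceId: string;
  payload: Record<string, unknown>;
}

interface SyncResult {
  sourceId: string;
  destinationId: string;
}

async function runSyncCycle(
  fetchChanges: () => Promise<SyncRecord[]>,
  writeToDestination: (record: SyncRecord) => Promise<string>,
  saveSyncState: (result: SyncResult) => Promise<void>,
): Promise<SyncResult[]> {
  const changes = await fetchChanges(); // 1. read changes from the source
  const results: SyncResult[] = [];
  for (const record of changes) {
    // 2-3. normalize and write to the destination
    const destinationId = await writeToDestination(record);
    const result = { sourceId: record.sourceId, destinationId };
    await saveSyncState(result); // 4. store both system IDs durably
    results.push(result);
  }
  return results;
}
```

The durable `saveSyncState` step is the part teams skip; it is what makes re-runs safe.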

Decide the first workflow before you write code

Do not start with "sync everything."

Start with one workflow such as:

  • Salesforce Account -> QuickBooks customer

  • Salesforce Opportunity -> QuickBooks invoice

  • QuickBooks invoice -> Salesforce custom object

  • QuickBooks payment state -> Salesforce account or finance view

Then define:

  • the source system of record

  • the destination object

  • the external ID field

  • the sync trigger

  • the conflict rule

Those decisions matter more than the first HTTP request.
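
Writing those decisions down as data keeps them explicit. A hypothetical config type for one workflow might look like this (field names and values are examples, not a required schema):

```typescript
// Hypothetical config capturing the five decisions for one sync workflow.
type ConflictRule = "source-wins" | "destination-wins" | "newest-wins";

interface SyncWorkflow {
  name: string;
  systemOfRecord: "salesforce" | "quickbooks";
  destinationObject: string; // e.g. a Salesforce custom object API name
  externalIdField: string;   // the field that makes upserts idempotent
  trigger: "user-initiated" | "scheduled" | "webhook";
  conflictRule: ConflictRule;
}

const invoiceMirror: SyncWorkflow = {
  name: "QuickBooks invoice -> Salesforce custom object",
  systemOfRecord: "quickbooks",
  destinationObject: "QuickBooks_Invoice__c",
  externalIdField: "QuickBooks_Invoice_Id__c",
  trigger: "user-initiated",
  conflictRule: "source-wins",
};
```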

The APIs to use

On the QuickBooks Desktop side

Use Conductor.

That gives you:

  • a modern QuickBooks Desktop API

  • health checks for real connection readiness

  • typed SDKs

  • auth and setup flow for the QuickBooks machine

  • normalized error handling for QuickBooks and Web Connector failures

On the Salesforce side

Use Salesforce's normal platform APIs:

  • REST API for most reads and writes

  • SOQL query endpoint for pulling changed records

  • External ID upsert endpoints for idempotent writes

  • Composite API when you want fewer round trips for related operations

  • Bulk API 2.0 for initial data loads or large catch-up syncs

That is usually enough to build a robust integration.

Recommended data model

Your app should store at least:

  • conductor_end_user_id for the QuickBooks company-file connection

  • Salesforce record IDs

  • QuickBooks record IDs

  • the last sync cursor or timestamp

  • sync status and last error details

Do not rely on display names alone. Store actual IDs and external IDs so updates are deterministic.
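
A sketch of one row in that mapping table, with hypothetical example values:

```typescript
// Hypothetical shape of one row in your sync-mapping table.
interface SyncMapping {
  conductorEndUserId: string; // QuickBooks company-file connection
  salesforceId: string;       // Salesforce record ID
  quickbooksId: string;       // QuickBooks-side ID
  lastSyncCursor: string;     // ISO timestamp or opaque cursor
  status: "synced" | "pending" | "error";
  lastError: string | null;
}

// Example row; all values are illustrative.
const exampleMapping: SyncMapping = {
  conductorEndUserId: "example-end-user-id",
  salesforceId: "001XXXXXXXXXXXXXXX",
  quickbooksId: "80000001-1234567890",
  lastSyncCursor: "2026-01-01T00:00:00Z",
  status: "synced",
  lastError: null,
};
```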

Use external IDs on the Salesforce side

This is one of the most important implementation details.

If you want repeated syncs to be safe, define External ID fields in Salesforce for the QuickBooks-side records you mirror there.

Examples:

  • QuickBooks_Customer_Id__c

  • QuickBooks_Invoice_Id__c

  • QuickBooks_Item_Id__c

That lets you upsert safely instead of asking "does this record already exist?" on every write.
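
An upsert by External ID is a PATCH to a URL that names the object, the External ID field, and the value. A small helper makes the shape explicit (the object and field names below are the examples from this article, not required names):

```typescript
// Builds the Salesforce upsert-by-External-ID endpoint path.
// Object and field names are examples; use your own org's API names.
function upsertUrl(
  instanceUrl: string,
  objectApiName: string,
  externalIdField: string,
  externalIdValue: string,
  apiVersion = "v61.0",
): string {
  return `${instanceUrl}/services/data/${apiVersion}/sobjects/${objectApiName}/${externalIdField}/${encodeURIComponent(externalIdValue)}`;
}
```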

Example: sync QuickBooks invoices into Salesforce

One common pattern is to mirror QuickBooks invoices into a Salesforce custom object so sales, success, or finance teams can see accounting status inside Salesforce.

The example below does three things:

  1. checks the QuickBooks Desktop connection through Conductor

  2. fetches recently updated invoices from QuickBooks Desktop

  3. upserts them into a Salesforce custom object using an External ID

import Conductor from "conductor-node";

const conductor = new Conductor({
  apiKey: process.env.CONDUCTOR_SECRET_KEY!,
});

const conductorEndUserId = process.env.CONDUCTOR_END_USER_ID!;
const salesforceInstanceUrl = process.env.SALESFORCE_INSTANCE_URL!;
const salesforceAccessToken = process.env.SALESFORCE_ACCESS_TOKEN!;

async function syncRecentInvoicesToSalesforce() {
  await conductor.qbd.healthCheck({ conductorEndUserId });

  const invoices = await conductor.qbd.invoices.list({
    conductorEndUserId,
    updatedAfter: "2026-01-01T00:00:00Z", // in production, read this from your stored sync cursor
    limit: 100,
  });

  for (const invoice of invoices.data) {
    const response = await fetch(
      `${salesforceInstanceUrl}/services/data/v61.0/sobjects/QuickBooks_Invoice__c/QuickBooks_Invoice_Id__c/${encodeURIComponent(invoice.id)}`,
      {
        method: "PATCH",
        headers: {
          Authorization: `Bearer ${salesforceAccessToken}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({
          Name: invoice.refNumber ?? `Invoice ${invoice.id}`,
          QuickBooks_Invoice_Id__c: invoice.id,
          Total_Amount__c: invoice.totalAmount,
          QuickBooks_Updated_At__c: invoice.updatedAt,
        }),
      },
    );

    if (!response.ok) {
      throw new Error(
        `Salesforce upsert failed for invoice ${invoice.id}: ${response.status}`,
      );
    }
  }
}

syncRecentInvoicesToSalesforce().catch(console.error);

This example assumes you created a Salesforce custom object named QuickBooks_Invoice__c with an External ID field named QuickBooks_Invoice_Id__c.

That is usually a better starting point than forcing accounting records into unrelated standard CRM objects.

Example: read candidate source data from Salesforce

On the Salesforce side, a typical read pattern is:

  • query the records changed since your last cursor

  • map the fields into your accounting model

  • send the resulting write to QuickBooks Desktop through Conductor

For example, you might query Accounts that changed recently and map them into QuickBooks customers.

const soql = [
  "SELECT Id, Name, BillingStreet, BillingCity, BillingState,",
  "BillingPostalCode, BillingCountry, LastModifiedDate",
  "FROM Account",
  "WHERE LastModifiedDate = LAST_N_DAYS:7",
].join(" ");

const queryUrl = `${salesforceInstanceUrl}/services/data/v61.0/query?q=${encodeURIComponent(soql)}`;

const response = await fetch(queryUrl, {
  headers: {
    Authorization: `Bearer ${salesforceAccessToken}`,
  },
});

if (!response.ok) {
  throw new Error(`Salesforce query failed: ${response.status}`);
}

const accountResults = await response.json();
console.log(accountResults.records);

The exact QuickBooks write that follows depends on your object model and workflow. The important point is the pattern:

  • query Salesforce cleanly

  • map intentionally

  • write through Conductor

  • store both IDs
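
The "map intentionally" step deserves its own function. Here is a sketch, assuming a simplified Account shape; the field names on the QuickBooks side are illustrative, not a specific SDK's payload format:

```typescript
// Hypothetical mapping from a Salesforce Account row to a QuickBooks
// customer payload. QuickBooks-side field names are illustrative.
interface SalesforceAccount {
  Id: string;
  Name: string;
  BillingStreet: string | null;
  BillingCity: string | null;
  BillingState: string | null;
  BillingPostalCode: string | null;
  BillingCountry: string | null;
}

function toQuickBooksCustomer(account: SalesforceAccount) {
  return {
    name: account.Name,
    billingAddress: {
      line1: account.BillingStreet ?? undefined,
      city: account.BillingCity ?? undefined,
      state: account.BillingState ?? undefined,
      postalCode: account.BillingPostalCode ?? undefined,
      country: account.BillingCountry ?? undefined,
    },
    // Keep the Salesforce ID so the sync-state table can link both records.
    salesforceAccountId: account.Id,
  };
}
```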

Use Composite API when writes are related

If one logical sync action needs several Salesforce writes, for example:

  • update an Account

  • create or upsert a QuickBooks_Invoice__c

  • update a sync-log object

then Salesforce Composite API can reduce round trips and keep the write grouped more cleanly.

That is usually not necessary for your very first sync, but it is a good optimization once the base workflow is stable.
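
A Composite request bundles subrequests into one call, and later subrequests can reference earlier results. A minimal body for the three writes above might look like this; the record IDs, object names, and field names are placeholders:

```typescript
// Example Composite API request body grouping related writes.
// "allOrNone" rolls back the whole group if any subrequest fails.
// All IDs and object/field names below are placeholders.
const compositeBody = {
  allOrNone: true,
  compositeRequest: [
    {
      method: "PATCH",
      url: "/services/data/v61.0/sobjects/Account/001XXXXXXXXXXXXXXX",
      referenceId: "updateAccount",
      body: { QuickBooks_Synced__c: true },
    },
    {
      method: "PATCH",
      url: "/services/data/v61.0/sobjects/QuickBooks_Invoice__c/QuickBooks_Invoice_Id__c/80000001-123",
      referenceId: "upsertInvoice",
      body: { Total_Amount__c: 1250.0 },
    },
    {
      method: "POST",
      url: "/services/data/v61.0/sobjects/Sync_Log__c",
      referenceId: "logSync",
      // References the ID returned by the earlier subrequest.
      body: { Invoice_Ref__c: "@{upsertInvoice.id}" },
    },
  ],
};
```

The body is sent as a single POST to the `/composite` endpoint instead of three separate requests.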

Use Bulk API 2.0 for backfills, not your first prototype

If you need to load thousands of historical records into Salesforce, Bulk API 2.0 is the right tool.

But it is usually a mistake to begin there.

Start with the single-record or small-batch path first. Once the mapping and sync-state logic are stable, add Bulk API 2.0 for:

  • historical invoice imports

  • customer backfills

  • catch-up jobs after a long outage
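
When you get there, a Bulk API 2.0 ingest job is defined by a small JSON document, and the records themselves are uploaded as CSV. A sketch, reusing the example object and External ID field from earlier (the CSV helper is naive and does not quote values containing commas):

```typescript
// Sketch of a Bulk API 2.0 ingest job definition for an upsert backfill.
// Object and External ID field names are the examples used in this article.
const bulkJobDefinition = {
  object: "QuickBooks_Invoice__c",
  operation: "upsert",
  externalIdFieldName: "QuickBooks_Invoice_Id__c",
  contentType: "CSV",
  lineEnding: "LF",
};

// Bulk API 2.0 ingests CSV, so historical records are flattened into rows.
// Naive serializer: assumes no commas, quotes, or newlines in values.
function toCsv(rows: Array<Record<string, string>>): string {
  const headers = Object.keys(rows[0]);
  const lines = rows.map((row) => headers.map((h) => row[h]).join(","));
  return [headers.join(","), ...lines].join("\n");
}
```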

Recommended sync UX

For QuickBooks Desktop, a user-triggered sync usually works best.

Good examples:

  • Sync with QuickBooks

  • Push to QuickBooks

  • Refresh accounting status

That works better than pretending the sync is always invisible and always online, because the user who can fix a blocked QuickBooks prompt is often the person sitting at the machine that owns the company file.

Common mistakes

Avoid these early mistakes:

  • writing two-way sync before you define system of record

  • skipping external IDs

  • mapping Salesforce Opportunities directly to accounting transactions without business review

  • installing the QuickBooks connection on the wrong Windows machine

  • assuming a Web Connector heartbeat proves the connection is healthy

When to use a custom Salesforce object

A custom object is often the best choice when you want Salesforce users to see QuickBooks state without forcing that data into standard CRM objects.

Good examples:

  • mirrored QuickBooks invoices

  • payment status snapshots

  • sync audit rows

  • accounting summaries tied to Accounts or Opportunities

This keeps the sync explicit and avoids overloading core CRM fields with accounting semantics they were not designed for.

Frequently asked questions

Can I sync QuickBooks Enterprise with Salesforce too?

Yes. QuickBooks Enterprise is still a QuickBooks Desktop environment, so the same approach applies.

Should I use Salesforce standard objects or custom objects?

Usually both. Use standard objects where the business meaning is real, such as Account, and use custom objects where you are mirroring accounting records like invoices or payment status.

Do I need Bulk API 2.0 right away?

Usually no. Start with REST API and External ID upserts. Add Bulk API 2.0 when you need larger backfills.

Do I need QuickBooks Web Connector for this architecture?

Usually yes, on the QuickBooks Desktop side. Conductor handles that layer so your app does not have to build directly on the native Desktop stack.

Bottom line

To sync QuickBooks Desktop with Salesforce, the practical stack is:

  • Conductor for QuickBooks Desktop

  • Salesforce REST, Composite, and Bulk APIs where appropriate

  • your own backend for mapping, sync state, and business rules

That gives you the best division of responsibility.

Related reading