Setting the batch size via Bulk API
I'm using the Bulk API to make updates to ~50k records. I'm splitting those up into jobs of about 5k apiece, but I was hoping there'd be a way to control the batch size, similar to Data Loader (i.e., you can tell it to process in chunks of 200 records at a time, or even 1 record at a time).
Is this configurable? If not, is there a different way to achieve it? The main issue is that some of the jobs fail because a record triggers too many workflows, which bogs down the execution time, so I'd rather isolate those records and use smaller batch sizes.
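For context, here's roughly what I'm doing now. This is only a sketch of the Bulk API 1.0 flow; the object name, API version, and the `session_id`/`instance_url` values are placeholders for my actual setup.

```python
import requests

API_VERSION = "44.0"  # placeholder

def submit_update_job(csv_header, csv_rows, session_id, instance_url):
    """Create one Bulk API 1.0 update job and upload its rows as one batch."""
    base = f"{instance_url}/services/async/{API_VERSION}"
    json_headers = {"X-SFDC-Session": session_id,
                    "Content-Type": "application/json"}

    # 1. Create the job (object name is a placeholder).
    job = requests.post(f"{base}/job", headers=json_headers, json={
        "operation": "update",
        "object": "Account",
        "contentType": "CSV",
    }).json()

    # 2. Upload the rows as a single CSV batch.
    requests.post(f"{base}/job/{job['id']}/batch",
                  headers={"X-SFDC-Session": session_id,
                           "Content-Type": "text/csv"},
                  data="\n".join([csv_header] + csv_rows))

    # 3. Close the job so Salesforce finishes processing it.
    requests.post(f"{base}/job/{job['id']}", headers=json_headers,
                  json={"state": "Closed"})

# ~50k rows, split into jobs of about 5k apiece:
# for i in range(0, len(all_rows), 5000):
#     submit_update_job(header, all_rows[i:i + 5000], session_id, instance_url)
```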
api bulk-api
asked 3 hours ago by Matt
1 Answer
The Bulk API is intended to load large numbers of records asynchronously, in parallel. It also has fairly small daily limits: you could technically set up 10,000 batches of 1 record each, but then you'd hit the daily limit of 10,000 batches per rolling 24-hour period. Use the normal synchronous API instead; if you need batches smaller than about 1,000 records, the Bulk API is not for you. Note that, unlike the synchronous API, each batch gets a 10-minute processing limit (with retries for additional chunks), and each chunk of 200 records gets 5 minutes, rather than the smaller time limits that apply to synchronous updates.
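To put the limit math in perspective, here's a quick back-of-the-envelope sketch in Python (record count from the question; 10,000 is the documented daily Bulk API batch limit):

```python
# Batches consumed for 50,000 records at various Bulk API batch
# sizes, versus the 10,000 batches per rolling 24 hours limit.
TOTAL_RECORDS = 50_000
DAILY_BATCH_LIMIT = 10_000

for batch_size in (1, 200, 1_000, 5_000):
    batches = -(-TOTAL_RECORDS // batch_size)  # ceiling division
    pct = 100 * batches / DAILY_BATCH_LIMIT
    print(f"batch size {batch_size:>5}: {batches:>6} batches "
          f"({pct:.1f}% of the daily limit)")
```

At a batch size of 1, those 50k records would need 50,000 batches, five times the daily allowance; at 1,000 and up, the cost is negligible.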
With the synchronous API, the batch size is how many records are submitted at once per API call. In the Bulk API, the batch size is how many records are submitted in a single file, which is termed a batch. The batch is then broken down into chunks of 100 or 200 records each (depending on API version). As long as each chunk runs in less than 5 minutes, you should be okay. If you can't get decent performance with values smaller than 1,000 or so, it's simply going to be too "expensive" in terms of daily limits to use this API.
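If you go the synchronous route, here's a minimal sketch using the REST sObject Collections resource, which accepts up to 200 records per call (the API version, object type, and field names are illustrative, and `session_id`/`instance_url` are assumed to come from your existing auth flow):

```python
import requests

def update_in_chunks(records, session_id, instance_url,
                     batch_size=200, api_version="44.0"):
    """Update records synchronously via the REST sObject Collections API.

    Each call submits at most `batch_size` records; the endpoint itself
    caps a single call at 200.
    """
    url = f"{instance_url}/services/data/v{api_version}/composite/sobjects"
    headers = {"Authorization": f"Bearer {session_id}",
               "Content-Type": "application/json"}
    for i in range(0, len(records), batch_size):
        chunk = records[i:i + batch_size]
        resp = requests.patch(url, headers=headers,
                              json={"allOrNone": False, "records": chunk})
        resp.raise_for_status()
        for record, result in zip(chunk, resp.json()):
            if not result["success"]:
                print(record["Id"], result["errors"])

# Each record carries an attributes block naming its sObject type:
# records = [{"attributes": {"type": "Account"},
#             "Id": "001xx0000000001", "Rating": "Hot"}, ...]
# update_in_chunks(records, session_id, instance_url, batch_size=50)
```

Because you pick `batch_size` yourself (anywhere from 1 to 200 per call), this gives you the Data Loader-style control you're after.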
edited 2 hours ago, answered 2 hours ago by sfdcfox (accepted)

I think I understand, but want to confirm: essentially Bulk isn't going to work if I want to do very small batches, and I should instead take those records and use something like this: developer.salesforce.com/docs/atlas.en-us.216.0.api_rest.meta/…
– Matt, 28 mins ago

@Matt or the SOAP API. Data Loader in non-bulk mode uses this API if you don't want to roll your own solution.
– sfdcfox, 26 mins ago