
API Limits with #FDEC


The FireDAC Enterprise Connectors (#FDEC) by CData and Embarcadero make it easy to work with various APIs just as you would any SQL database. For example, if you want to publish the results of a query to a Google Sheet (which I find incredibly useful), it takes just a few FireDAC components and you are off to the races. You might run into an API limit, though.

What is an API limit? Most REST services limit how often a client can call a specific API within a certain amount of time. Google calls this their usage limit:

This version of the Google Sheets API has a limit of 500 requests per 100 seconds per project, and 100 requests per 100 seconds per user. Limits for reads and writes are tracked separately. There is no daily usage limit.

That may seem like a lot, but I found I was running into that limit pretty quickly once I moved my project into production. Luckily, FireDAC and the FireDAC Enterprise Connectors have a simple workaround: batch processing.

Using the Array DML feature of FireDAC, you can batch multiple DML (Data Manipulation Language) operations into a single API call. The FDEC Google Sheets documentation from CData doesn't cover Array DML, but the component supports it (they are updating the documentation). The Elasticsearch documentation does cover batch processing with an example, and I've used the same approach with Sheets and it works great!

Bulk Insert

The following example prepares a single batch that inserts records in bulk.

FDConnection1.ResourceOptions.ServerOutput := True;
FDQuery1.SQL.Text := 'insert into Account values (:Name, :Id)';
FDQuery1.Params.ArraySize := 100;
FDQuery1.Params[0].AsStrings[0] := 'MyName1';
FDQuery1.Params[1].AsStrings[0] := 'MyId1';
// next statement
FDQuery1.Params[0].AsStrings[1] := 'MyName2';
FDQuery1.Params[1].AsStrings[1] := 'MyId2';
// ...
FDQuery1.Execute(FDQuery1.Params.ArraySize);
ShowMessage(IntToStr(FDQuery1.RowsAffected));

To retrieve the Ids of the new records, query the LastResultInfo#TEMP table:

FDQuery1.Open('SELECT * FROM [LastResultInfo#TEMP]');
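As a sketch of what you might do with that result set: the loop below collects the returned Ids into a string list. Note this assumes the temporary table exposes the generated key in a column named `Id`; the actual column name may differ by connector, so check the CData documentation for yours.

```delphi
// Hypothetical sketch: collect generated Ids after a bulk insert.
// Assumes a column named 'Id' in LastResultInfo#TEMP, which may vary.
var
  Ids: TStringList;
begin
  Ids := TStringList.Create;
  try
    FDQuery1.Open('SELECT * FROM [LastResultInfo#TEMP]');
    while not FDQuery1.Eof do
    begin
      Ids.Add(FDQuery1.FieldByName('Id').AsString);
      FDQuery1.Next;
    end;
    // ... use Ids here, e.g. to link the new records to other data
  finally
    Ids.Free;
  end;
end;
```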

Bulk Update

The following example prepares a single batch that updates records in bulk.

FDQuery1.SQL.Text := 'update Account set Name = :Name where Id = :Id';
FDQuery1.Params.ArraySize := 100;
FDQuery1.Params[0].AsStrings[0] := 'Floppy Disks';
FDQuery1.Params[1].AsStrings[0] := 'Id1';
// next statement
FDQuery1.Params[0].AsStrings[1] := 'Jon Doe';
FDQuery1.Params[1].AsStrings[1] := 'Id2';
// ...
FDQuery1.Execute(FDQuery1.Params.ArraySize);
ShowMessage(IntToStr(FDQuery1.RowsAffected));

Bulk Delete

The following example prepares a single batch that deletes records in bulk:

FDQuery1.SQL.Text := 'delete from Account where Id = :Id';
FDQuery1.Params.ArraySize := 100;
FDQuery1.Params[0].AsStrings[0] := 'MyId1';
// next statement
FDQuery1.Params[0].AsStrings[1] := 'MyId2';
// ...
FDQuery1.Execute(FDQuery1.Params.ArraySize);
ShowMessage(IntToStr(FDQuery1.RowsAffected));
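If a single batch is still large enough to trip the quota, FireDAC's `Execute` also accepts `ATimes` and `AOffset` arguments, which let you run a parameter array in slices. The following is a rough sketch, not a drop-in solution: the chunk size and the `Sleep` pacing are assumptions you would tune to your own API quota, and it assumes `Math` and `Windows` (or `System.SysUtils` for `Sleep` alternatives) are in your uses clause.

```delphi
// Hypothetical sketch: execute a large parameter array in smaller slices,
// pausing between slices to stay under a per-interval API quota.
// Per the FireDAC docs, Execute(ATimes, AOffset) runs rows AOffset..ATimes-1.
const
  ChunkSize = 50; // assumed value; tune to your quota
var
  Offset, Next: Integer;
begin
  Offset := 0;
  while Offset < FDQuery1.Params.ArraySize do
  begin
    Next := Min(Offset + ChunkSize, FDQuery1.Params.ArraySize);
    FDQuery1.Execute(Next, Offset);
    Offset := Next;
    Sleep(1000); // crude pacing; a real app might track the quota window
  end;
end;
```

A nice side effect of `AOffset` is error recovery: if a batch fails partway through, you can resume from the failing row instead of resending rows that already succeeded.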

If you want to learn more about Array DML, check out these videos:

Array DML Skill Sprint with Pawel Glowacki


FireDAC in Depth with Cary Jensen


Also check out Cary Jensen’s book on the topic of FireDAC in Depth.



About author

Director of Delphi Consulting for GDK Software USA. Holder of many software-related patents, including swipe-and-pattern unlock and search engines. Earned the first Silver and Gold Delphi badges on Stack Overflow. Former Developer Advocate for Embarcadero Technologies. Long-time fan of programming, especially with Delphi. Author, podcaster/YouTuber, improvisor, public speaker, father, and friend.
