Did you know it’s easy to use some truly excellent Python libraries to super-charge your Delphi app development on Windows? Adding Python to your Delphi toolbox can enhance your app development, bringing in new capabilities that help you deliver innovative and powerful solutions to your app’s users, combining the best of Python with the supreme low-code and unparalleled power of native Windows development you get with Delphi.
Are you looking for a way to build a GUI for a powerful AI library? You can build a state-of-the-art deep learning solution with fastai on Delphi. This post will show you how to build a Delphi GUI app dedicated to the fastai library.
Watch this video by Jim McKeeth for a comprehensive introduction to why you can love both Delphi and Python at the same time:
What is the fastai library?
fastai is a deep learning library that provides practitioners with high-level components that can quickly and easily provide state-of-the-art results in standard deep learning domains and provides researchers with low-level components that can be mixed and matched to build new approaches. It aims to do both things without substantial compromises in ease of use, flexibility, or performance.
fastai includes:
- A new type dispatch system for Python along with a semantic type hierarchy for tensors
- A GPU-optimized computer vision library that can be extended in pure Python
- An optimizer which refactors out the common functionality of modern optimizers into two basic pieces, allowing optimization algorithms to be implemented in 4–5 lines of code
- A novel 2-way callback system that can access any part of the data, model, or optimizer and change it at any point during training
- A new data block API
- And much more…
Best of all, fastai is organized around two main design goals: to be approachable and rapidly productive, while also being deeply hackable and configurable.
fastai is built on top of a hierarchy of lower-level APIs which provide composable building blocks. This way, a user wanting to rewrite part of the high-level API or add particular behavior to suit their needs does not have to learn how to use the lowest level.
How do I install the fastai Library?
You can easily install fastai with pip:
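The usual command, which also pulls in PyTorch as a dependency, is:

```
pip install fastai
```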
Or, if you are using the Anaconda Python distribution, you can use this command to avoid complexities and conflicts between required libraries:
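For example, using the fastchan channel that the fastai project maintains (check the fastai docs for the channel currently recommended for your setup):

```
conda install -c fastchan fastai
```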
How do I build a Delphi GUI for the fastai library?
The following is the user interface structure for our project:
Here is the list of components used in the Fastai4D demo app:
- TPythonEngine
- TPythonModule
- TPythonType
- TPythonVersions
- TPythonGUIInputOutput
- TForm
- TMemo
- TOpenDialog
- TSaveDialog
- TSplitter
- TImage
- TPanel
- TLabel
- TComboBox
- TButton
Navigate to UnitFastai4D.pas, and add the following line to FormCreate to load our basic fastaiApp.py:
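A minimal sketch of what this looks like, assuming the script editor is a TMemo named Memo1 (the form and component names in your project may differ):

```delphi
procedure TForm1.FormCreate(Sender: TObject);
begin
  // Load the default fastai script into the editor memo at startup
  Memo1.Lines.LoadFromFile('fastaiApp.py');
end;
```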
Make sure that fastaiApp.py is in the same directory as Fastai4D.exe or inside your Delphi project folder.
You can replace “fastaiApp.py” with any fastai script you want, or load your fastai scripts at runtime by clicking “Load script…”, as we will show in the demo sections below.
How to perform Deep Learning using fastai in a Delphi app?
Highly recommended practices:
1. This GUI was created by modifying Python4Delphi Demo34, which lets us change the Python version at runtime (this will save you from seemingly complicated DLL issues).
2. Add “Jpeg” to the uses list at the top of our UnitFastai4D.pas code. We have to do that because otherwise Delphi cannot understand the JPG format; with this correction, it should work. After the change, the uses clause should look similar to this:
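A sketch of the change (your existing uses list will contain other units as well):

```delphi
uses
  // ... the units already listed by the IDE ...
  Jpeg;  // or the fully qualified name Vcl.Imaging.Jpeg in recent Delphi versions
```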
And we can load JPG images into our TImage.
3. Add these paths to your PATH environment variable for regular Python:
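The exact locations depend on your Python version and install options; typical entries look like this (the username and version number are placeholders):

```
C:\Users\YOUR_USERNAME\AppData\Local\Programs\Python\Python3x
C:\Users\YOUR_USERNAME\AppData\Local\Programs\Python\Python3x\Scripts
```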
Or add these paths if you use the Anaconda Python distribution:
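Again, adjust these to your actual Anaconda installation directory; typical entries are:

```
C:\Users\YOUR_USERNAME\anaconda3
C:\Users\YOUR_USERNAME\anaconda3\Library\bin
C:\Users\YOUR_USERNAME\anaconda3\Scripts
```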
4. To load the image dataset or to create plots, you will need to use Matplotlib in a way that is outside the “normal” command-line process. To do so, add these lines to all your Python code:
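The idea is to switch Matplotlib to a non-interactive backend and save the figure to a file instead of showing it in a window; a minimal sketch:

```python
import matplotlib
matplotlib.use('Agg')           # non-interactive backend: render to a file, not a window
import matplotlib.pyplot as plt

# ... your fastai / plotting code goes here ...

plt.savefig('fastaiImage.jpg')  # the GUI's "Show plot" button loads this file
```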
We strongly recommend naming your image output “fastaiImage.jpg”, so that it loads automatically into the GUI after you click the “Show plot” button.
5. Add MaskFPUExceptions(True); to the UnitFastai4D.pas file, to stop Delphi raising an exception when floating-point operations produce +/- infinity (e.g. division by zero). Delphi’s default floating-point exception handling is incompatible with a number of Python libraries such as NumPy, SciPy, pandas, and Matplotlib.
One of the best parts of this Fastai4D demo GUI is that you can choose the Python version you prefer, and switch between versions.
We have tested this GUI with both regular Python and Anaconda Python; it works better with the regular Python distribution.
Next, click the “Execute” button to run the very basic example that downloads and loads the image dataset and prints out the labels (the Python code is already called inside the UnitFastai4D.pas file), and click the “Show plot” button to show the figure. Here is the output:
How to implement deep learning for image classification using fastai and Delphi?
Examples of implementations:
1. Load image datasets with their labels
In this embedded fastai Python code, we are going to use the Oxford-IIIT Pet Dataset by O. M. Parkhi et al., 2012, which features 12 cat breeds and 25 dog breeds. Our model will learn to differentiate between these 37 distinct categories. According to their paper, the best accuracy they could get in 2012 was 59.21%, using a complex model that was specific to pet detection, with separate “Image”, “Head”, and “Body” models for the pet photos.
Here is the default example embedded in the Fastai4D GUI:
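A sketch of that example, loosely following the fastai “lesson1-pets” notebook [3]; the script bundled with the demo may differ in details:

```python
from fastai.vision.all import *
import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot as plt

bs = 64
#bs = 16 ## Use this if you have low computational power

# Download and extract the Oxford-IIIT Pet dataset
path = untar_data(URLs.PETS)
path_img = path/'images'
fnames = get_image_files(path_img)

# The breed label is encoded in each file name, e.g. "great_pyrenees_173.jpg"
pat = r'^(.*)_\d+.jpg$'
dls = ImageDataLoaders.from_name_re(path_img, fnames, pat,
                                    item_tfms=Resize(224), bs=bs)

print(dls.vocab)                 # the 37 pet categories
dls.show_batch(max_n=9, figsize=(7, 6))
plt.savefig('fastaiImage.jpg')   # picked up by the "Show plot" button
```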
You can download the script here.
If you have low computational power, set the batch size to 16 by uncommenting this line in the fastaiApp.py code: #bs = 16 ## Use this if you have low computational power.
2. Train deep learning model
This example will train ResNet-34. We will use a convolutional neural network backbone and a fully connected head with a single hidden layer as a classifier. ResNet-34 is a 34-layer convolutional neural network that can be used as a state-of-the-art image classification model.
Illustrated below, on the right-hand side, is ResNet-34’s architecture, where the 34 layers and the residuals from one layer to another are visualized:
We are building a model which will take images as input and will output the predicted probability for each of the categories (in this case, it will have 37 outputs), and we will train it for 4 epochs (4 cycles through all our data).
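A sketch of what this training step looks like with the fastai v2 API; the actual demo02_trainModel.py in the repository may differ slightly:

```python
from fastai.vision.all import *

# Rebuild the DataLoaders from the previous example
path = untar_data(URLs.PETS)/'images'
dls = ImageDataLoaders.from_name_re(path, get_image_files(path),
                                    r'^(.*)_\d+.jpg$', item_tfms=Resize(224), bs=64)

# ResNet-34 backbone with a new classification head, trained for 4 epochs
learn = cnn_learner(dls, resnet34, metrics=error_rate)
learn.fit_one_cycle(4)
learn.save('stage-1')   # save the trained weights for later use
```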
Load demo02_trainModel.py at runtime by clicking the “Load script…” button, and then click “Execute”. Here is the output:
The code in this section will train the model for 4 epochs (4 cycles through all our data) and save the result.
If you run this without a GPU, it will take a long time to run. The following is our test on a regular Core i5 laptop with an onboard graphics card and 12 GB of RAM:
3. What do the results of using fastai look like?
Once the deep learning model training is complete, you will see the following results:
The result shows the categories that the model most often confused with one another. We will check whether what the model predicted was reasonable or not. In this case, the mistakes look reasonable (none of them seems obviously naive), which is an indicator that our classifier is working correctly.
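In fastai this inspection is typically done with a ClassificationInterpretation object; a sketch, assuming learn is the trained learner from the previous step:

```python
from fastai.vision.all import *
import matplotlib.pyplot as plt

# Build an interpretation object from the trained learner
interp = ClassificationInterpretation.from_learner(learn)

# Show the images with the highest loss, i.e. the model's worst mistakes
interp.plot_top_losses(9, figsize=(15, 11))
plt.savefig('fastaiImage.jpg')   # for the "Show plot" button
```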
The result inside Python4Delphi GUI:
Furthermore, we want to plot the confusion matrix. From the confusion matrix we can see that the distribution is heavily skewed: the model makes the same mistakes over and over again, but it rarely confuses other categories. This suggests that it just finds it difficult to distinguish some specific categories from each other; this is normal behavior.
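A sketch of the corresponding call, continuing with interp from the previous snippet:

```python
# Plot the 37x37 confusion matrix and save it for the "Show plot" button
interp.plot_confusion_matrix(figsize=(12, 12), dpi=60)
plt.savefig('fastaiImage.jpg')
```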
The confusion matrix inside Python4Delphi GUI:
Let’s print out the most confused results, down to a minimum value of 2:
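That corresponds to the following call, which lists the category pairs confused at least twice (again using interp from above):

```python
# List (actual, predicted, count) for every pair confused at least min_val times
print(interp.most_confused(min_val=2))
```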
Since our model is working as we expect it to, we will unfreeze it and train some more. Here are the results:
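A sketch of that step, continuing with learn from the training example:

```python
# Unfreeze the whole network (not just the head) and train one more cycle
learn.unfreeze()
learn.fit_one_cycle(1)
```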
Plot the learning rate:
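With the Agg backend, the learning-rate finder plot is saved to a file; a sketch:

```python
# Run the learning-rate finder and save its plot for the "Show plot" button
learn.lr_find()
plt.savefig('fastaiImage.jpg')
```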
Fitting for two cycles:
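For example, with a learning-rate range picked from the finder plot (the exact values below are illustrative):

```python
# Train two more epochs with discriminative learning rates across layer groups
learn.unfreeze()
learn.fit_one_cycle(2, lr_max=slice(1e-6, 1e-4))
```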
Such a powerful deep learning library, right?
Visit this repository [4] for the complete source code for this Fastai4Delphi project, and this reference [3] for the original notebook and more comprehensive concepts about fastai for image classification.
Are you ready to try this excellent Fastai4Delphi example?
Congratulations, now you have learned a lot about fastai, a state-of-the-art deep learning library, and how you can use Delphi to create a simple yet powerful GUI for it! We have learned the fundamentals of deep learning for image classification, and now you can explore them to boost your productivity in creating your own deep learning apps.
If you are looking for other powerful AI libraries, please read this article:
https://pythongui.org/learn-to-build-a-gui-for-these-10-ultimate-python-ai-libraries/
Download a free trial of RAD Studio Delphi today and try out these examples for yourself.
References & Further Readings
[1] CS231n. (2022). Convolutional Neural Networks (CNNs/ConvNets). Stanford. cs231n.github.io/convolutional-networks
[2] He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep residual learning for image recognition. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 770-778).
[3] Howard, J. (2022). Lesson 1 – What’s your pet. fast.ai repo GitHub. github.com/fastai/fastai/blob/master/dev_nbs/course/lesson1-pets.ipynb
[4] Hakim, M. A. (2022). Article18 – Fastai4D Demo. embarcaderoBlog-repo GitHub. github.com/MuhammadAzizulHakim/embarcaderoBlog-repo/tree/main/Article18%20-%20Fastai4D%20Demo