In this part of the series, you will try to create a Spring Boot application from scratch using an AI coding assistant. The goal is not merely to create a working application, but to create production-grade code. Enjoy!

1. Introduction

Nowadays, many AI coding assistants are available. They are demonstrated at conferences, shown in videos, described in blogs, and so on. The demos are often impressive, and it seems that AI is able to generate almost all of the source code for you; you only need to review it. However, when you start using AI coding assistants at work, it may seem that they do not work for you at all and only cost you more time. The truth lies somewhere in between. AI coding assistants can save you a lot of time on certain tasks, but they also have limitations. It is important to learn which tasks they can help you with and how to recognize when you have hit the limits of AI. Beware that AI is evolving at a fast pace, so the limitations of today may be resolved in the near future.

In the remainder of this blog, some tasks are executed with the help of an AI coding assistant. The responses are evaluated, and different techniques are applied to improve the responses where necessary. This blog is part of a series; the previous parts can be read here:

In this part, you will try to generate a Spring Boot application using an AI coding assistant. However, some preconditions apply to this Spring Boot application. The most important precondition is that the generated code must be production-grade. Just generating working software is not enough.

The tasks are executed with the IntelliJ IDEA DevoxxGenie AI coding assistant.

The setup used in this blog is LM Studio as inference engine and qwen2.5-coder:7b as model. The model runs on a GPU.

As you can see, locally running models are used. The reason for doing so is that you will hit the limits of a model sooner.

The sources used in this blog are available on GitHub. The explanation of how the project is created can be found here.

2. Prerequisites

Prerequisites for reading this blog are:

  • Basic coding knowledge;
  • Basic knowledge of AI coding assistants;
  • Basic knowledge of DevoxxGenie, for more information you can read a previous blog or watch the conference talk given at Devoxx.

3. Create Skeleton

First, a skeleton needs to be created because the Spring Boot application must meet some requirements.

  • The REST API must be defined by means of an OpenAPI specification;
  • The controller interface must be generated by means of the openapi-generator-maven-plugin;
  • PostgreSQL must be used as database;
  • Liquibase must be used to create the database tables;
  • jOOQ must be used to access the database;
  • The jOOQ classes must be generated by means of the testcontainers-jooq-codegen-maven-plugin.

Navigate to Spring Initializr and add the following dependencies:

  • Spring Web
  • PostgreSQL Driver
  • JOOQ Access Layer
  • Validation
  • Liquibase Migration

The following changes are applied to the generated Spring Boot application:

  • The controller interface must be generated based on the OpenAPI specification: add the plugin openapi-generator-maven-plugin and the dependency swagger-annotations;
  • Add scope runtime to the dependency liquibase-core;
  • Add a file db.changelog-root.xml at src/main/resources/db/changelog/db.changelog-root.xml as the root file for the Liquibase migration scripts;
  • The jOOQ classes should be generated: add the plugin testcontainers-jooq-codegen-maven-plugin;
  • Remove the test from the test sources.
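To make the list above more concrete, a sketch of how the two code-generation plugins could appear in the pom.xml is shown below. This is only an illustration: versions are omitted, the spec location and the options shown are assumptions, and the actual configuration in the repository may differ.

```xml
<!-- Sketch only: versions omitted and options reduced to the essentials;
     consult the plugin documentation for the full set of options. -->
<plugin>
    <groupId>org.openapitools</groupId>
    <artifactId>openapi-generator-maven-plugin</artifactId>
    <executions>
        <execution>
            <goals>
                <goal>generate</goal>
            </goals>
            <configuration>
                <!-- the spec location is an assumption; the spec itself is created in a later section -->
                <inputSpec>${project.basedir}/src/main/resources/static/customers.yaml</inputSpec>
                <generatorName>spring</generatorName>
                <configOptions>
                    <!-- only generate the controller interface, not a full server stub -->
                    <interfaceOnly>true</interfaceOnly>
                </configOptions>
            </configuration>
        </execution>
    </executions>
</plugin>
<plugin>
    <groupId>org.testcontainers</groupId>
    <artifactId>testcontainers-jooq-codegen-maven-plugin</artifactId>
    <!-- configuration omitted: the plugin starts a PostgreSQL Testcontainer,
         applies the Liquibase migrations and generates the jOOQ classes
         from the resulting schema -->
</plugin>
```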

The changes are applied to branch feature/base-repository.

Run the build.

$ mvn clean verify

The build fails because the OpenAPI specification is missing. However, this is the starting point.

4. Generate OpenAPI Specification

The build fails on the missing OpenAPI specification, so let’s fix this.

4.1 Prompt

Enter the prompt.

Generate an OpenAPI specification version 3.1.1. 
The spec should contain CRUD methods for customers. 
The customers have a first name and a last name. 
Use the Zalando restful api guidelines.

4.2 Response

The response can be viewed here.

4.3 Apply Response

Add the response to file src/main/resources/static/customers.yaml.

Additionally, change the following.

  • Change the identifiers from strings to integers;
  • Add the identifier to the Customer schema; this way, the identifier will be returned in the responses.
components:
  schemas:
    Customer:
      type: object
      properties:
        id:
          type: integer
          format: int64
        firstName:
          type: string
        lastName:
          type: string

Run the build. The build is successful, and the generated sources are available in the directory target/generated-sources/openapi.

The build shows some warnings; these can be fixed by adding an operationId to each operation in the OpenAPI specification. For now, the warnings are ignored.

[WARNING] Empty operationId found for path: GET /customers. Renamed to auto-generated operationId: customersGET
[WARNING] Empty operationId found for path: POST /customers. Renamed to auto-generated operationId: customersPOST
[WARNING] Empty operationId found for path: GET /customers/{id}. Renamed to auto-generated operationId: customersIdGET
[WARNING] Empty operationId found for path: PUT /customers/{id}. Renamed to auto-generated operationId: customersIdPUT
[WARNING] Empty operationId found for path: DELETE /customers/{id}. Renamed to auto-generated operationId: customersIdDELETE
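Should you want to resolve the warnings later, adding an operationId per operation is sufficient. A sketch is shown below; the names are suggestions, not taken from the actual specification.

```yaml
# Only the operationId lines are shown; all other fields are omitted.
paths:
  /customers:
    get:
      operationId: listCustomers
    post:
      operationId: createCustomer
  /customers/{id}:
    get:
      operationId: getCustomer
    put:
      operationId: updateCustomer
    delete:
      operationId: deleteCustomer
```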

The changes can be viewed here.

5. Generate Liquibase Scripts

In this section, the Liquibase scripts will be generated.

5.1 Prompt

Open the OpenAPI spec and enter the prompt.

Based on this openapi spec, generate liquibase migration scripts in XML format

5.2 Response

The response can be viewed here.

5.3 Apply Response

The generated XML Liquibase script is entirely correct. Create a file db.changelog-1.xml in the directory src/main/resources/db/changelog/migration and copy the response into it. Besides that, change the author to mydeveloperplanet.

<changeSet id="1" author="mydeveloperplanet">
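For reference, the overall shape of such a changelog is sketched below. The column names and types are assumptions derived from the domain (id, firstName, lastName); the actual generated script may differ in details.

```xml
<databaseChangeLog
        xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
                            http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-latest.xsd">
    <changeSet id="1" author="mydeveloperplanet">
        <createTable tableName="customers">
            <column name="id" type="BIGINT" autoIncrement="true">
                <constraints primaryKey="true" nullable="false"/>
            </column>
            <column name="first_name" type="VARCHAR(255)">
                <constraints nullable="false"/>
            </column>
            <column name="last_name" type="VARCHAR(255)">
                <constraints nullable="false"/>
            </column>
        </createTable>
    </changeSet>
</databaseChangeLog>
```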

Run the build.

The build log shows that the tables are generated.

Feb 23, 2025 12:42:59 PM liquibase.changelog
INFO: Reading resource: src/main/resources/db/changelog/migration/db.changelog-1.xml
Feb 23, 2025 12:42:59 PM liquibase.changelog
INFO: Creating database history table with name: public.databasechangelog
Feb 23, 2025 12:42:59 PM liquibase.changelog
INFO: Reading from public.databasechangelog
Feb 23, 2025 12:42:59 PM liquibase.command
INFO: Using deploymentId: 0310979670
Feb 23, 2025 12:42:59 PM liquibase.changelog
INFO: Reading from public.databasechangelog
Running Changeset: src/main/resources/db/changelog/migration/db.changelog-1.xml::1::yourname
Feb 23, 2025 12:42:59 PM liquibase.changelog
INFO: Table customers created
Feb 23, 2025 12:42:59 PM liquibase.changelog

In the directory target/generated-sources/jooq, you can also find the generated jOOQ classes.

The changes can be viewed here.

6. Generate Domain Model

In this section, the domain model will be generated.

6.1 Prompt

Open the Liquibase migration script and enter the prompt.

Create a domain model based on this liquibase migration script

6.2 Response

The response can be viewed here.

6.3 Apply Response

Create class Customer in package com.mydeveloperplanet.myaicodeprojectplanet.model. This clashes with the OpenAPI domain model package.

Change the following lines in the pom.xml:

<packageName>com.mydeveloperplanet.myaicodeprojectplanet</packageName>
<apiPackage>com.mydeveloperplanet.myaicodeprojectplanet.api</apiPackage>
<modelPackage>com.mydeveloperplanet.myaicodeprojectplanet.model</modelPackage>

Into:

<packageName>com.mydeveloperplanet.myaicodeprojectplanet.openapi</packageName>
<apiPackage>com.mydeveloperplanet.myaicodeprojectplanet.openapi.api</apiPackage>
<modelPackage>com.mydeveloperplanet.myaicodeprojectplanet.openapi.model</modelPackage>

Run the build; the build is successful.
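The domain model itself is a plain Java class. A minimal sketch is shown below; the field names follow the Liquibase columns, and the actual generated class may differ in details such as validation or equals/hashCode.

```java
// Minimal sketch of the Customer domain model; the real class in the
// repository may differ in details.
public class Customer {
    private Long id;
    private String firstName;
    private String lastName;

    public Customer() {
    }

    public Customer(Long id, String firstName, String lastName) {
        this.id = id;
        this.firstName = firstName;
        this.lastName = lastName;
    }

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
    public String getFirstName() { return firstName; }
    public void setFirstName(String firstName) { this.firstName = firstName; }
    public String getLastName() { return lastName; }
    public void setLastName(String lastName) { this.lastName = lastName; }
}
```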

The changes can be viewed here.

7. Generate Repository

In this section, the repository will be generated.

7.1 Prompt

Add the full project and also the generated jOOQ classes from the directory target/generated-sources/jooq to the Prompt Context. Note: it later turned out that DevoxxGenie did not add these files at all, because they were listed in the .gitignore file; see this issue.

Generate a CustomerRepository in order that the operations defined in the openapi spec customers.yaml are supported

7.2 Response

The response can be viewed here.

The response uses Spring Data JPA, which is not what is wanted.

7.3 Prompt

Give explicit instructions to only use dependencies in the pom.xml.

Generate a CustomerRepository in order that the operations defined in the openapi spec customers.yaml are supported.
Only use dependencies available in the pom.xml

7.4 Response

The response can be viewed here.

The same response is returned. The instructions are ignored.

7.5 Prompt

Let’s be more specific. Enter the prompt.

You do not use the dependencies defined in the pom.
You should use jooq instead of jpa

7.6 Response

The response can be viewed here.

7.7 Apply Response

This response looks great. Even an example of how to use it in a service is included.

Create a package com/mydeveloperplanet/myaicodeprojectplanet/repository and paste the CustomerRepository code into it. Some issues are present:

  • A RuntimeException is thrown when a Customer cannot be found. This probably needs to be changed, but for the moment it will do.
  • The package com.mydeveloperplanet.myaicodeprojectplanet.jooq could not be found. A Maven sync solved this issue.
  • Customers.CUSTOMERS could not be found. The following import needed to be added: import com.mydeveloperplanet.myaicodeprojectplanet.jooq.tables.Customers;
  • Two compile errors remain due to a non-existent exists() method.
if (dslContext.selectFrom(Customers.CUSTOMERS)
              .where(Customers.CUSTOMERS.ID.eq(id))
              .exists())

7.8 Prompt

Open a new chat window. Open the CustomerRepository file and enter the prompt.

the .exists() method does not seem to be available, fix the code

7.9 Response

The response can be viewed here.

7.10 Apply Response

The response suggests using the selectExists method, but this method does not exist either.

7.11 Prompt

Enter the follow-up prompt.

the selectExists method is also not available, fix the code properly

7.12 Response

The response can be viewed here.

7.13 Apply Response

The response suggests using the fetchExists method. This is closer to the real solution, but it still does not compile. The LLM suggests:

boolean exists = dslContext.selectFrom(Customers.CUSTOMERS)
                           .where(Customers.CUSTOMERS.ID.eq(id))
                           .fetchExists();

Time to help a little and change it manually to the correct implementation. Note that the where clause must be kept; otherwise, the query only checks whether the table contains any row at all.

boolean exists = dslContext.fetchExists(dslContext.selectFrom(Customers.CUSTOMERS)
                                                  .where(Customers.CUSTOMERS.ID.eq(id)));

Run the build; the build is successful.

7.14 Prompt

In the current implementation, the repository methods use the jOOQ-generated CustomersRecord as an argument. This means that the service layer would need to know about the repository layer, which is not wanted. The service layer should only know the domain model.

Open a new chat window, add the src directory to the Prompt Context and also the generated jOOQ classes from the directory target/generated-sources/jooq. Enter the prompt.

I want that the methods make use of the Customer model and that any mappings 
between Customer and CustomerRecord are done in the CustomerRepository itself

7.15 Response

The response can be viewed here.

7.16 Apply Response

This looks great. However, some variables and arguments are named record, which is a restricted identifier in recent Java versions. Rename them to customerRecord.

Run the build; the build is successful.

This took a bit longer than the previous generations, but the end result is quite good.
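The mapping pattern the repository ends up with can be illustrated with simplified stand-ins. The two classes below are hypothetical substitutes for the jOOQ-generated CustomersRecord and the domain Customer, so that the sketch is self-contained; the real repository maps between the actual generated types.

```java
// Hypothetical stand-in for the jOOQ-generated CustomersRecord,
// reduced to bare fields for illustration only.
class CustomersRecord {
    Long id;
    String firstName;
    String lastName;
}

// Hypothetical stand-in for the domain model.
class Customer {
    final Long id;
    final String firstName;
    final String lastName;

    Customer(Long id, String firstName, String lastName) {
        this.id = id;
        this.firstName = firstName;
        this.lastName = lastName;
    }
}

class CustomerMapping {
    // The repository converts inbound domain objects to records...
    static CustomersRecord toRecord(Customer customer) {
        CustomersRecord customerRecord = new CustomersRecord();
        customerRecord.id = customer.id;
        customerRecord.firstName = customer.firstName;
        customerRecord.lastName = customer.lastName;
        return customerRecord;
    }

    // ...and outbound records back to domain objects, so callers of the
    // repository only ever see the domain model.
    static Customer toDomain(CustomersRecord customerRecord) {
        return new Customer(customerRecord.id, customerRecord.firstName, customerRecord.lastName);
    }
}
```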

The changes can be viewed here.

8. Generate Service

In this section, the service will be generated.

8.1 Prompt

Open a new chat window and add the full project to the Prompt Context. Enter the prompt.

Create a spring service in order that the operations defined in the openapi spec customers.yaml are supported. 
The service must use the CustomerRepository.

8.2 Response

The response can be viewed here.

8.3 Apply Response

The response looks good. A service interface is also created, although it is not really needed.

Create package com/mydeveloperplanet/myaicodeprojectplanet/service and add the Service class and interface.

Run the build; the build is successful.

The changes can be viewed here.

9. Generate Controller

In this section, the controller will be generated.

9.1 Prompt

Open a new chat window, add the src directory and the target/generated-sources/openapi/src directory to the Prompt Context. Enter the prompt.

Create a Spring Controller in order that the operations defined in the openapi spec customers.yaml are supported. 
The controller must implement CustomersApi. 
The controller must use the CustomersService.

9.2 Response

The response can be viewed here.

9.3 Apply Response

The response looks good. Create the package com/mydeveloperplanet/myaicodeprojectplanet/controller and add the controller to this package. Some issues exist:

  • The import for CustomersApi is missing, add it.
  • The arguments in the methods use the Customer domain model, which is not correct. They should use the OpenAPI Customer model.

9.4 Prompt

Enter a follow-up prompt.

The interface is not correctly implemented. 
The interface must use the openapi Customer model and 
must convert it to the Customer domain model which is used by the service.

9.5 Response

The response can be viewed here.

9.6 Apply Response

The response is not correct. The LLM does not seem to see the difference between the Customer domain model and the Customer OpenAPI model.

The response also contains syntax that does not exist in Java; import aliasing like the line below looks more like Python.

import com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer as OpenAPICustomer;

9.7 Prompt

Enter a follow-up prompt.

This is not correct. 
Try again, the openai Customer model is available in package com.mydeveloperplanet.myaicodeprojectplanet.openapi.model, 
the domain model is available in package com.mydeveloperplanet.myaicodeprojectplanet.model

9.8 Response

The response can be viewed here.

This response is identical to the previous one.

Let’s help the LLM a little and fix the methods manually: replace OpenAPICustomer with com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer.

This still raises compile errors, but maybe the LLM can fix those.

9.9 Prompt

Open a new chat window, add the src directory and the target/generated-sources/openapi/src directory to the Prompt Context. Enter the prompt.

The CustomersController has the following compile errors:
* customersGet return value is not correct
* customersIdGet return value is not correct
Fix this

9.10 Response

The response can be viewed here.

9.11 Apply Response

This seems to be a better solution. Only the following snippet does not compile, since the generated OpenAPI Customer model does not offer such a constructor.

private com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer convertToOpenAPIModel(Customer customer) {
    return new com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer(
            customer.getId(),
            customer.getFirstName(),
            customer.getLastName()
    );
}

Let’s fix this manually.

private com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer convertToOpenAPIModel(Customer customer) {
    com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer openAPICustomer = 
            new com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer();
    openAPICustomer.setId(customer.getId());
    openAPICustomer.setFirstName(customer.getFirstName());
    openAPICustomer.setLastName(customer.getLastName());
    return openAPICustomer;
}

Run the build; the build is successful.

The changes can be viewed here.

10. Run Application

Time to run the application.

Add the following dependency in order to start a PostgreSQL database when running the application.

<dependency>
	<groupId>org.springframework.boot</groupId>
	<artifactId>spring-boot-docker-compose</artifactId>
	<scope>runtime</scope>
	<optional>true</optional>
</dependency>

Add a compose.yaml file to the root of the repository.

services:
  postgres:
    image: 'postgres:17-alpine'
    environment:
      - 'POSTGRES_DB=mydatabase'
      - 'POSTGRES_PASSWORD=secret'
      - 'POSTGRES_USER=myuser'
    labels:
      - "org.springframework.boot.service-connection=postgres"
    ports:
      - '5432'

Run the application.

mvn spring-boot:run

An error occurs.

2025-02-23T15:29:27.478+01:00 ERROR 33602 --- [MyAiCodeProjectPlanet] [           main] o.s.b.d.LoggingFailureAnalysisReporter   : 

***************************
APPLICATION FAILED TO START
***************************

Description:

Liquibase failed to start because no changelog could be found at 'classpath:/db/changelog/db.changelog-master.yaml'.

Action:

Make sure a Liquibase changelog is present at the configured path.

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  19.530 s
[INFO] Finished at: 2025-02-23T15:29:27+01:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.springframework.boot:spring-boot-maven-plugin:3.4.3:run (default-cli) on project myaicodeprojectplanet: Process terminated with exit code: 1 -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

This can be fixed by adding the following line to the application.properties file.

spring.liquibase.change-log=classpath:db/changelog/db.changelog-root.xml

Run the application again; this time it starts successfully.

11. Test Application

The application runs, but is it also functional?

11.1 Prompt

Open a new chat window and open the OpenAPI specification. Enter the prompt.

Generate some curl commands in order to test this openapi spec

11.2 Response

The response can be viewed here.

11.3 Run Tests

Create a Customer.

curl -X POST "http://localhost:8080/customers" -H "Content-Type: application/json" -d '{
  "firstName": "John",
  "lastName": "Doe"
}'
{"timestamp":"2025-02-23T14:33:11.903+00:00","status":404,"error":"Not Found","path":"/customers"}

This test fails. The cause is that the CustomersController has the following unnecessary annotation.

@RequestMapping("/customers")

This should not be here: the mapping is already part of the CustomersApi interface.

Remove this line, build the application and run it again.

The changes can be viewed here.

Create a Customer. This is successful.

curl -X POST "http://localhost:8080/customers" -H "Content-Type: application/json" -d '{
  "firstName": "John",
  "lastName": "Doe"
}'

Retrieve a Customer. This is successful.

curl -X GET "http://localhost:8080/customers/1" -H "accept: application/json"
{"id":1,"firstName":"John","lastName":"Doe"}

Update a Customer. This is successful.

curl -X PUT "http://localhost:8080/customers/1" -H "Content-Type: application/json" -d '{
  "id": 1,
  "firstName": "Jane",
  "lastName": "Doe"
}'

Retrieve all Customers. This is successful.

curl -X GET "http://localhost:8080/customers" -H "accept: application/json"
[{"id":1,"firstName":"Jane","lastName":"Doe"}]

Delete a Customer. This is successful.

curl -X DELETE "http://localhost:8080/customers/1" -H "accept: application/json"

Retrieve all Customers. An empty list is returned. This is successful.

curl -X GET "http://localhost:8080/customers" -H "accept: application/json"
[]

12. Conclusion

It is possible to create a Spring Boot application from scratch using a local LLM. Creating the repository and the controller required some extra iterations and manual intervention. However, the result is quite good: the application is functional and meets the initial requirements.

