Sun, 13 July 2025
10 min read
In this first backend-focused post, I'll explore my tech stack choices, layered architecture decisions, and modern testing strategies with Docker Testcontainers.
Welcome to my second post in the digital planner series! While my first post explored the project’s vision and initial planning, this post dives deep into the technical implementation of the Spring Boot backend that powers my mental health and productivity app. I’ll walk you through my architectural decisions, technology choices, and the valuable lessons learned while building and testing my Spring backend.
While developing my mental health app, I came up with several ideas for potential improvements and realised there are plenty of features to implement. To keep track of them, I've decided to use GitHub Projects. This lets me take a more agile approach, effectively tracking and prioritising the features essential to the MVP while building a backlog of features and fixes I may need later. With the kanban board, I can give each feature its own card, assign effort values, and link related GitHub issues and PRs, giving a comprehensive view of the project.
I created my Spring application using Spring Initializr, configuring the initial setup by selecting the Java version, Spring Boot version, and build tool. I chose Gradle for its fast compilation and support for incremental builds, which helps reduce iteration time during development.
This choice allows me to focus more on implementing core application functionality rather than wasting time on complicated build configurations.
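As a rough illustration of that starting point, a minimal `build.gradle.kts` for a stack like this might declare the Spring Boot web, data JPA, and validation starters along with the PostgreSQL driver, Lombok, and the test dependencies used later in this post. The exact versions and dependency set below are illustrative, not copied from my project:

```kotlin
plugins {
    java
    id("org.springframework.boot") version "3.3.0"           // version is illustrative
    id("io.spring.dependency-management") version "1.1.5"
}

repositories { mavenCentral() }

dependencies {
    implementation("org.springframework.boot:spring-boot-starter-web")
    implementation("org.springframework.boot:spring-boot-starter-data-jpa")
    implementation("org.springframework.boot:spring-boot-starter-validation")
    runtimeOnly("org.postgresql:postgresql")
    compileOnly("org.projectlombok:lombok")
    annotationProcessor("org.projectlombok:lombok")
    testImplementation("org.springframework.boot:spring-boot-starter-test")
    testImplementation("org.springframework.boot:spring-boot-testcontainers")
    testImplementation("org.testcontainers:postgresql")
}
```

Spring Initializr generates most of this for you, which is exactly why it saves time on build configuration.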
I chose PostgreSQL as my database as I have been interested in exploring it, and also for its JSON support and ACID compliance. To ensure consistency across development and testing environments, I containerised the database using Docker:
Development Environment: Uses the official PostgreSQL Docker image via Docker Compose, providing a consistent local development experience. I use environment variables in Docker to ensure that database usernames and passwords aren’t leaked, along with passing in the secret keys for Clerk authentication.
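A sketch of what such a Compose setup can look like; the service name, database name, and variable names here are illustrative rather than my exact file, with credentials pulled from an uncommitted `.env` file:

```yaml
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: ${DB_USER}          # read from a local .env file, never committed
      POSTGRES_PASSWORD: ${DB_PASSWORD}
      POSTGRES_DB: planner
    ports:
      - "5432:5432"
    volumes:
      - db-data:/var/lib/postgresql/data # persist data between container restarts

volumes:
  db-data:
```

Keeping secrets in environment variables means the Compose file itself is safe to commit.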
Integration Testing: Utilises Testcontainers with a PostgreSQL Docker image. This allows me to spin up isolated, ephemeral database instances that more closely replicate the production environment, enabling more reliable and reproducible tests. I’ll explore this setup in more detail in the testing section.
Developing a REST API in Spring Boot requires organising code into distinct layers, each with its own responsibility. This principle is called separation of concerns. It ensures every part of the application has a clear purpose, keeping the code clean and maintainable. The main layers I’m using in my application are Entities, DTOs, Services, and Controllers.
While learning Spring, I read the book *Spring Getting Started*. It gave me a solid foundation, from the core Spring context through to building and testing applications.
Entities serve as the contract between the PostgreSQL database and Java application, defining exactly how data is structured, stored, and retrieved. They represent the core data structures and their relationships while ensuring type safety and constraint enforcement across both the Java application and the database. The MoodEntry entity demonstrates a few important design decisions:
```java
@Entity
@Table(name = "mood_entry")
@Getter
@Setter
@ToString
@NoArgsConstructor
public class MoodEntry {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @Min(1)
    @Max(5)
    @NotNull
    @Column(nullable = false)
    private Short moodScore;

    @NotNull
    @Column(nullable = false)
    private Instant dateTime;

    @JdbcTypeCode(SqlTypes.JSON)
    @Column(name = "factors")
    private List<String> factors;

    @Column(columnDefinition = "TEXT")
    private String notes;
}
```
DTOs provide a stable interface between the API and its clients, decoupling internal database entities from the API responses sent to frontend applications. They also let each endpoint expose only the fields it needs and keep validation rules at the API boundary rather than buried in the persistence layer. Here are the response and creation DTOs for mood entries:
```java
@Getter
@Setter
@NoArgsConstructor
@ToString
public class MoodEntryResponseDTO {
    private Long id;
    private Short moodScore;
    private Instant dateTime;
    private List<String> factors;
    private String notes;
}
```

```java
@Getter
@Setter
@NoArgsConstructor
@ToString
public class MoodEntryCreationDTO {

    @Min(1)
    @Max(5)
    @NotNull
    private Short moodScore;

    @NotNull
    private Instant dateTime;

    private List<String> factors;

    private String notes;
}
```
The repository layer handles data persistence and retrieval. Spring Data JPA makes this simple:
```java
@Repository
public interface MoodEntryRepository extends JpaRepository<MoodEntry, Long> {
    // JpaRepository provides standard CRUD operations automatically
}
```
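Beyond the built-in CRUD methods, Spring Data JPA can also derive queries from method names alone. As a sketch, a hypothetical finder for entries within a date range (not currently part of my repository) would need nothing but a declaration:

```java
@Repository
public interface MoodEntryRepository extends JpaRepository<MoodEntry, Long> {
    // Derived query: Spring generates the "WHERE date_time BETWEEN ? AND ?"
    // SQL purely from the method name, no implementation required
    List<MoodEntry> findByDateTimeBetween(Instant start, Instant end);
}
```

This is handy for features like a weekly mood chart, where the frontend only needs entries from a given range.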
The controller acts as the bridge between HTTP requests and the application’s business logic, which lives in the service layer. This layer receives HTTP requests from the frontend or other clients, ensures each request is properly formatted, and passes it on to the correct service for processing.
Here is a snippet from my MoodEntryController showing how a new mood entry is created:
```java
@RestController
@RequestMapping("api/mood")
public class MoodEntryController {

    private final MoodEntryService moodEntryService;

    public MoodEntryController(MoodEntryService moodEntryService) {
        this.moodEntryService = moodEntryService;
    }

    @PostMapping
    public ResponseEntity<MoodEntryResponseDTO> createMoodEntry(
            @RequestBody @Valid MoodEntryCreationDTO creationDTO) {
        MoodEntryResponseDTO responseDTO = moodEntryService.createMoodEntry(creationDTO);
        return new ResponseEntity<>(responseDTO, HttpStatus.CREATED);
    }
}
```
Annotation Breakdown:
- `@RestController` marks the class as a request handler whose return values are serialised straight into the HTTP response body as JSON.
- `@RequestMapping("api/mood")` sets the base path for every endpoint in the class.
- `@PostMapping` maps HTTP POST requests to the `createMoodEntry` method.
- `@RequestBody` deserialises the incoming JSON into a `MoodEntryCreationDTO`, and `@Valid` triggers the bean validation rules (`@Min`, `@Max`, `@NotNull`) declared on that DTO.

What This Method Does: it accepts a validated creation DTO, delegates the actual work to the service layer, and returns the newly created mood entry with an HTTP 201 Created status.
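Putting it together, a client creating a mood entry sends a POST to `/api/mood` with a JSON body matching the creation DTO and gets the response DTO back, now carrying the generated id. The values below are illustrative:

```
POST /api/mood
Content-Type: application/json

{
  "moodScore": 4,
  "dateTime": "2025-07-13T09:30:00Z",
  "factors": ["sleep", "exercise"],
  "notes": "Felt good after a morning run"
}

HTTP/1.1 201 Created
Content-Type: application/json

{
  "id": 1,
  "moodScore": 4,
  "dateTime": "2025-07-13T09:30:00Z",
  "factors": ["sleep", "exercise"],
  "notes": "Felt good after a morning run"
}
```

Note that the client never supplies an id; that comes from the database via `GenerationType.IDENTITY`.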
The service layer encapsulates all business logic and coordinates between the controller layer (which handles HTTP requests) and the data access layer (the repository that interacts with the database). It also manages the conversion between DTOs (Data Transfer Objects) used by the API and entities used by the database.
```java
@Service
public class MoodEntryService {

    private final MoodEntryRepository moodEntryRepository;

    public MoodEntryService(MoodEntryRepository moodEntryRepository) {
        this.moodEntryRepository = moodEntryRepository;
    }

    public MoodEntryResponseDTO createMoodEntry(MoodEntryCreationDTO creationDTO) {
        // TODO when adding multiple users, get the authenticated user ID
        // and set it on the entity before saving
        MoodEntry moodEntryToSave = toEntity(creationDTO);
        MoodEntry savedMoodEntry = moodEntryRepository.save(moodEntryToSave);
        return toDto(savedMoodEntry);
    }

    // Mapping methods convert between DTOs and entities
    private MoodEntry toEntity(MoodEntryCreationDTO creationRequest) {
        MoodEntry entity = new MoodEntry();
        entity.setMoodScore(creationRequest.getMoodScore());
        entity.setDateTime(creationRequest.getDateTime());
        entity.setFactors(creationRequest.getFactors());
        entity.setNotes(creationRequest.getNotes());
        return entity;
    }

    private MoodEntryResponseDTO toDto(MoodEntry entity) {
        MoodEntryResponseDTO moodResponse = new MoodEntryResponseDTO();
        moodResponse.setId(entity.getId());
        moodResponse.setMoodScore(entity.getMoodScore());
        moodResponse.setDateTime(entity.getDateTime());
        moodResponse.setFactors(entity.getFactors());
        moodResponse.setNotes(entity.getNotes());
        return moodResponse;
    }
}
```
Current Features: The Spring application currently supports full CRUD operations for mood entries: creating, reading, updating, and deleting. Custom exception handling covers resource-not-found scenarios, ensuring graceful error responses. User authentication is planned as a future enhancement to associate mood entries with individual users.
Architecture Benefits: This layered architecture gives each layer a single responsibility: controllers handle HTTP requests, services manage business logic, and repositories handle data access. This improves maintainability and testability, while DTO mapping protects API contracts from database changes.
When integrating the application through fetch in my Next.js frontend, I encountered Cross-Origin Resource Sharing (CORS) issues, similar to what I had experienced in my cloud resume project. I was able to resolve this with the knowledge I had gained from the cloud resume challenge; the configuration below shows the HTTP methods my Spring app allows, along with OPTIONS as required. I have also added comments reminding me to change this configuration for production and to review the security settings later.
```java
@Configuration
public class CorsConfig {

    @Bean
    public WebMvcConfigurer corsConfigurer() {
        return new WebMvcConfigurer() {
            @Override
            public void addCorsMappings(CorsRegistry registry) {
                registry.addMapping("/api/**")
                        .allowedOrigins("http://localhost:3000") // change when moving to production as this is local
                        .allowedMethods("GET", "POST", "PUT", "DELETE", "OPTIONS")
                        .allowedHeaders("*") // for security, look at limiting the headers allowed
                        .allowCredentials(true);
            }
        };
    }
}
```
To ensure the reliability of a modern application, we need to test it before deploying it to production. To do this, I have implemented a multi-layered testing strategy combining unit tests with integration tests that use Testcontainers.
An example of unit testing is the pomodoro session test for fetching a session by ID. I test both the success and failure scenarios: if the session exists, it should be returned as expected, and when it does not exist, the not-found error should be raised as intended. This approach ensures that both the happy path and the error handling work correctly in isolation.
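The shape of that test is roughly the following. This is a simplified, self-contained sketch rather than my actual JUnit/Mockito test: `PomodoroSessionService` and `SessionNotFoundException` here are stand-ins backed by an in-memory map instead of a mocked repository, but the success/failure structure is the same.

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for the real exception thrown when a session id is unknown
class SessionNotFoundException extends RuntimeException {
    SessionNotFoundException(Long id) {
        super("Session not found: " + id);
    }
}

// Simplified service: an in-memory map plays the role of the mocked repository
class PomodoroSessionService {
    private final Map<Long, String> sessions = new HashMap<>();

    void save(Long id, String label) {
        sessions.put(id, label);
    }

    String getSessionById(Long id) {
        String session = sessions.get(id);
        if (session == null) {
            throw new SessionNotFoundException(id);
        }
        return session;
    }
}

public class PomodoroSessionServiceSketch {
    public static void main(String[] args) {
        PomodoroSessionService service = new PomodoroSessionService();
        service.save(1L, "Morning focus block");

        // Happy path: an existing id returns the session
        if (!service.getSessionById(1L).equals("Morning focus block")) {
            throw new AssertionError("expected the saved session back");
        }

        // Failure path: a missing id throws the not-found exception
        try {
            service.getSessionById(99L);
            throw new AssertionError("expected SessionNotFoundException");
        } catch (SessionNotFoundException expected) {
            // error handling works in isolation, as the unit test verifies
        }
    }
}
```

In the real test, the map is replaced by a Mockito mock of the repository, but the two branches being asserted are exactly these.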
While implementing Testcontainers, I initially encountered an issue where one integration test would work in isolation, but when running multiple tests, later tests would fail due to the Docker container restarting with a new port during the test lifecycle. This seemed to be caused by using an abstract base class for test configuration, as the container instance and its dynamically assigned port weren’t being properly shared across test classes.
Luckily, I discovered a fix for this issue through a JetBrains blog post, Testing Spring Boot Applications Using Testcontainers; it’s a great read to learn more about Testcontainers and Spring Boot.
```java
@TestConfiguration(proxyBeanMethods = false)
public class TestcontainersConfiguration {

    @Bean
    @ServiceConnection
    PostgreSQLContainer<?> postgresContainer() {
        return new PostgreSQLContainer<>(DockerImageName.parse("postgres:latest"));
    }
}
```
This configuration class benefits from a few modern Spring features. First, the `@ServiceConnection` annotation automatically configures the database connection properties, removing the need for manual property configuration. I can then import the configuration class into any of my integration tests using `@Import(TestcontainersConfiguration.class)`, which provides a clean, reusable way to use Postgres in my integration tests and ensures a consistent container lifecycle across all of them.
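As an illustration of how the configuration is consumed, an integration test just imports it and talks to the real Postgres container. The class and method names below are hypothetical, not lifted from my test suite:

```java
@SpringBootTest
@Import(TestcontainersConfiguration.class)
class MoodEntryRepositoryIntegrationTest {

    @Autowired
    private MoodEntryRepository moodEntryRepository;

    @Test
    void savesAndLoadsAMoodEntry() {
        MoodEntry entry = new MoodEntry();
        entry.setMoodScore((short) 4);
        entry.setDateTime(Instant.now());

        MoodEntry saved = moodEntryRepository.save(entry);

        // The entry round-trips through the real Postgres container,
        // not an in-memory substitute
        assertThat(moodEntryRepository.findById(saved.getId())).isPresent();
    }
}
```

Because `@ServiceConnection` wires the datasource to the container, the test needs no JDBC URL, username, or password properties at all.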
There’s still plenty of work to be done on the project, and I’ve got a few priorities: