# dotnet-interview-exercise

This exercise is designed to test your skills in C# and .NET.

## Technologies used in this exercise

* .NET 8.0 / C#
* Minimal ASP.NET Core Web API
* Entity Framework Core
* PostgreSQL database (Docker)
* HttpClient for downstream REST API calls
* ???

## Prerequisites

* Latest [.NET 8 SDK](https://dotnet.microsoft.com/en-us/download/dotnet/8.0)
* Visual Studio Code
* Docker Desktop and the docker compose plugin
* **Disable Copilot completions** (that would make it too easy...)
* Be ready to share your screen in Zoom
* Launch settings have been configured to run and debug the web application

## Exercise

### Phase 1

Note: For all APIs and schemas, see .

* Implement the missing code in `JsonPlaceholderClient.cs`:
  * Set up the client.
  * Add a method for fetching one post from the external API at `https://jsonplaceholder.typicode.com/posts/{id}`.
* In our endpoint handler in `Program.cs`, fetch the post with the provided ID.
* Store the received post in the database using the `Post` entity, but leave out `UpdatedAt` for now.
* Use the Swagger UI for testing.

```mermaid
sequenceDiagram
    title Overview
    actor B as Swagger UI
    participant S as Service
    participant E as External Service
    participant D as Database
    B->>S: GetPostById
    S<<->>E: GetPostById
    S->>D: Store Post
    S->>B: Return Post
```

**Success Criteria: Verify that the post has been saved to the database with:**

```bash
$ docker compose exec --env PGPASSWORD=password db psql -h localhost -U user --dbname mydatabase -c "select * from posts;"
 id | user_id |                                    title                                    |                         body                          |       updated_at
----+---------+-----------------------------------------------------------------------------+--------------------------------------------------------+------------------------
  1 |       1 | sunt aut facere repellat provident occaecati excepturi optio reprehenderit | quia et suscipit                                      +| 1970-01-01 00:00:00+00
    |         |                                                                             | suscipit recusandae consequuntur expedita et cum      +|
    |         |                                                                             | reprehenderit molestiae ut ut quas totam              +|
    |         |                                                                             | nostrum rerum est autem sunt rem eveniet architecto    |
(1 row)
```

### Phase 2

* Check whether we already have the post in the database.
* If we do, return the post from the database, but without the internal `updated_at` timestamp.
* If we don't, fetch the post from the external API, store it in the database, and return it.
* **Success Criteria: Check the service logs to see whether the post was read from the database or the client made a call to the downstream service.**

### Phase 3

* Set the `UpdatedAt` property of the `Post` entity to the current date and time when storing it in the database.
* If the post in the database was updated less than 1 minute ago, return the one from the database.
* If the post in the database is older than 1 minute (an extra short value for testing), fetch the post from the external API again and update it in the database before returning it.
* **Success Criteria: Check the service logs to see whether the post was read from the database or the client made a call to the downstream service.**

### Phase 4

* Think about how we could improve resiliency in downstream API calls.
* For setup, merge in the changes from the `phase4` branch.
* It adds chaos-injecting code in `Program.cs` to simulate random failures in downstream API calls.
* Execute a couple of requests with increasing IDs until an error happens.
* Implement basic resiliency measures to handle these kinds of failures (one possible approach is sketched below).
* **Success Criteria: Check the service logs to see the resiliency measures in action.**
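One possible shape for such a measure is a small retry with backoff around the downstream call. The sketch below is illustrative only: the class name, attempt count, and delay values are assumptions and not part of this repository. Libraries such as Polly or `Microsoft.Extensions.Http.Resilience` provide the same idea ready-made.

```csharp
// Sketch only: a minimal manual retry with exponential backoff.
// The class name, attempt count, and delays are illustrative assumptions.
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class RetrySketch
{
    public static async Task<HttpResponseMessage> GetWithRetryAsync(
        HttpClient client, string url, int maxAttempts = 3)
    {
        for (var attempt = 1; ; attempt++)
        {
            try
            {
                var response = await client.GetAsync(url);

                // Only transient server-side errors (5xx) are worth retrying.
                if ((int)response.StatusCode < 500 || attempt >= maxAttempts)
                    return response;
            }
            catch (HttpRequestException) when (attempt < maxAttempts)
            {
                // Network-level failure: fall through to the delay and try again.
            }

            // Exponential backoff: 200 ms, 400 ms, 800 ms, ...
            await Task.Delay(TimeSpan.FromMilliseconds(200 * Math.Pow(2, attempt - 1)));
        }
    }
}
```

However you implement it, log each retry attempt so the measure is visible in the service logs for the success criteria.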
### Phase 5

* Now increase the rate of 503 outcomes in the chaos injection to 95% to simulate a partial outage. What is the effect on our endpoint?
* Improve the error response we return by handling the downstream error accordingly.
* **Success Criteria: The service should return an appropriate error response that tells the client it is worth retrying the call.**

### Bonus Phase 6 (discussion only)

* The current DB schema solution has significant drawbacks. What are they?
* How would you implement schema migrations that avoid these issues?
* What do you need to consider in the setup for proper schema migrations?

### Bonus Phase 7 (discussion only)

* Imagine the following deployment scenario:
  * The service is deployed to and running in Amazon ECS (Elastic Container Service) on AWS.
  * The downstream service requires authentication.
* How would you store and retrieve the required credentials?
* What options do you see to also require authentication on our little service?

## Troubleshooting

### Start database manually

Open a terminal at the repo root and execute:

```bash
$ docker compose up -d db
[+] Running 3/3
 ✔ Network dotnet-interview-exercise_default     Created    0.0s
 ✔ Volume "dotnet-interview-exercise_db_data"    Created    0.0s
 ✔ Container dotnet-interview-exercise-db-1      Started    0.1s
```

### Cleanup database

In case something went wrong, wipe and rebuild the database:

```bash
$ docker compose down -v
[+] Running 3/3
 ✔ Container dotnet-interview-exercise-db-1      Removed.   0.2s
 ✔ Network dotnet-interview-exercise_default     Removed.   0.2s
 ✔ Volume dotnet-interview-exercise_db_data      Removed.   0.0s
$ docker compose up -d db
[+] Running 3/3
 ✔ Network dotnet-interview-exercise_default     Created    0.0s
 ✔ Volume "dotnet-interview-exercise_db_data"    Created    0.0s
 ✔ Container dotnet-interview-exercise-db-1      Started    0.1s
```
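### Check the database connection

If the service starts but cannot reach the database, compare the connection settings the application uses with the values from the compose setup shown above (`user` / `password` / `mydatabase` on `localhost`). The snippet below is only an illustrative sketch of what such a registration typically looks like with the `Npgsql.EntityFrameworkCore.PostgreSQL` provider; the actual context type name and configuration source in this repository may differ.

```csharp
// Illustrative sketch: "AppDbContext" and the inline connection string are
// assumptions, not necessarily what this repository uses. The values mirror
// the docker compose defaults referenced elsewhere in this README.
using Microsoft.EntityFrameworkCore;

// Inside Program.cs, where `builder` is the WebApplicationBuilder:
builder.Services.AddDbContext<AppDbContext>(options =>
    options.UseNpgsql("Host=localhost;Database=mydatabase;Username=user;Password=password"));
```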