A Django REST API for managing property listings and bookings. This application provides endpoints for creating and managing property listings, handling bookings, and managing reviews.
- Property Listings Management
- Booking System
- Review System
- User Authentication
- API Documentation with Swagger/ReDoc
- MySQL Database Integration
- Celery Task Queue Integration
- Database Seeding for Development
- Python 3.x
- Django 4.x
- Django REST Framework
- MySQL
- Redis (for Celery)
- Swagger/ReDoc for API documentation
- Python 3.x
- MySQL
- Redis
- Docker (optional)
- Clone the repository:
git clone <repository-url>
cd alx_travel_app
- Create a virtual environment and activate it:
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
- Install dependencies:
pip install -r requirements.txt
- Set up MySQL database:
# Using Docker
docker run --name alx_travel_mysql \
-e MYSQL_ROOT_PASSWORD=rootpassword \
-e MYSQL_DATABASE=alx_travel_db \
-e MYSQL_USER=alx_travel_user \
-e MYSQL_PASSWORD=alx_travel_pass \
-p 3306:3306 -d mysql:latest
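Django then needs to be pointed at this database. A minimal sketch of the relevant `settings.py` fragment, assuming the credentials are read from environment variables that mirror the `docker run` command above (the project's actual settings module may differ):

```python
# settings.py (sketch) - read MySQL credentials from the environment.
# The variable names mirror the docker run command above and are assumptions.
import os

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": os.environ.get("MYSQL_DATABASE", "alx_travel_db"),
        "USER": os.environ.get("MYSQL_USER", "alx_travel_user"),
        "PASSWORD": os.environ.get("MYSQL_PASSWORD", "alx_travel_pass"),
        "HOST": os.environ.get("MYSQL_HOST", "localhost"),
        "PORT": os.environ.get("MYSQL_PORT", "3306"),
    }
}
```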
- Create a `.env` file in the project root (use `.env.example` as a template):
cp .env.example .env
# Edit .env with your configuration
- Run migrations:
python manage.py migrate
- Create a superuser:
python manage.py createsuperuser
- (Optional) Seed the database with sample data:
python manage.py seed
This will create:
- 7 sample users (5 regular users, 2 staff users)
- 14 property listings (2 per user)
- Multiple bookings and reviews
All sample users have the password `password123`.
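For reference, a rough sketch of the shape such a seed command usually takes is shown below; the model names and fields here are assumptions for illustration, not the project's actual schema:

```python
# listings/management/commands/seed.py (sketch) - app, model, and field names
# are assumptions; adapt to the project's actual models.
from django.contrib.auth import get_user_model
from django.core.management.base import BaseCommand

from listings.models import Listing  # assumed app/model name


class Command(BaseCommand):
    help = "Seed the database with sample users and listings"

    def handle(self, *args, **options):
        User = get_user_model()
        for i in range(5):
            user, _ = User.objects.get_or_create(
                username=f"user{i}", defaults={"email": f"user{i}@example.com"}
            )
            user.set_password("password123")
            user.save()
            for j in range(2):
                Listing.objects.get_or_create(
                    title=f"Sample listing {i}-{j}",
                    defaults={"owner": user, "price_per_night": 100},
                )
        self.stdout.write(self.style.SUCCESS("Database seeded."))
```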
- Start the development server:
python manage.py runserver
- Start Celery worker (in a separate terminal):
celery -A alx_travel_app worker -l info
The API provides the following endpoints:
- `/api/listings/` - Property listings management
- `/api/bookings/` - Booking management
- `/api/reviews/` - Review management
- `/swagger/` - Swagger API documentation
- `/redoc/` - ReDoc API documentation
- `/admin/` - Admin interface
- `/api-auth/` - Authentication endpoints
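As a quick sanity check you can query the listings endpoint with any HTTP client. A sketch using `requests` against the local dev server; the exact response shape (plain list vs. paginated `results`) depends on the project's DRF pagination settings:

```python
# Sketch: query the listings endpoint on the local development server.
import requests

response = requests.get("http://localhost:8000/api/listings/")
response.raise_for_status()
print(response.json())
```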
The API uses Django REST Framework's built-in authentication. To access protected endpoints:
- Create a user account or use the superuser account
- Use the login endpoint or session authentication
- Include authentication credentials in your requests
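Which authentication classes are enabled depends on the project's `REST_FRAMEWORK` settings; as an illustrative sketch, a protected endpoint could be called with HTTP Basic credentials like this (session login via `/api-auth/` works similarly):

```python
# Sketch: call a protected endpoint with Basic auth, assuming Basic/Session
# authentication is enabled in the project's DRF settings.
import requests

response = requests.get(
    "http://localhost:8000/api/bookings/",
    auth=("admin", "your-password"),  # replace with your own credentials
)
print(response.status_code, response.json())
```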
The API is fully documented using Swagger/OpenAPI specification. You can explore and test the API endpoints using the interactive documentation.
- Swagger UI: `/swagger/`
- ReDoc: `/redoc/`
- Make sure to create and activate a virtual environment
- Install development dependencies
- Follow PEP 8 style guide
- Write tests for new features
- Update documentation as needed
Run the test suite:
python manage.py test
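New tests can be written with Django REST Framework's test helpers. A minimal sketch (the URL comes from the endpoint list above; whether authentication is required depends on the project's permission settings):

```python
# listings/tests.py (sketch) - a minimal API test; expectations are assumptions.
from django.contrib.auth import get_user_model
from rest_framework import status
from rest_framework.test import APITestCase


class ListingEndpointTests(APITestCase):
    def setUp(self):
        self.user = get_user_model().objects.create_user(
            username="tester", password="password123"
        )

    def test_listings_endpoint_is_reachable(self):
        self.client.login(username="tester", password="password123")
        response = self.client.get("/api/listings/")
        self.assertEqual(response.status_code, status.HTTP_200_OK)
```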
- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
This project uses Celery with RabbitMQ for handling background tasks such as sending email notifications.
- Install RabbitMQ:
sudo apt-get install rabbitmq-server
- Start RabbitMQ service:
sudo systemctl start rabbitmq-server
sudo systemctl enable rabbitmq-server
- Start the Celery worker:
celery -A alx_travel_app worker -l info
- (Optional) Start Celery beat for periodic tasks:
celery -A alx_travel_app beat -l info
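The `-A alx_travel_app` flag assumes the project defines a Celery application module in the usual place. A sketch of what that module typically looks like (the repository's actual file may differ):

```python
# alx_travel_app/celery.py (sketch) - standard Celery app wiring for a Django
# project; settings prefixed with CELERY_ are picked up from settings.py/.env.
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "alx_travel_app.settings")

app = Celery("alx_travel_app")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
```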
The application uses SMTP for sending emails. To configure email settings:
- Copy the email configuration from `.env.example` to your `.env` file
- Update the email settings with your SMTP credentials
- For Gmail, you'll need to:
  - Enable 2-factor authentication
  - Generate an App Password
  - Use the App Password as `EMAIL_HOST_PASSWORD`
- Asynchronous email notifications for booking confirmations
- Background task processing with Celery
- Email notifications using SMTP
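These notifications are sent from background Celery tasks. A minimal sketch of what such a task might look like; the module path, task name, and arguments are assumptions rather than the project's actual code:

```python
# listings/tasks.py (sketch) - send a booking confirmation email off the
# request cycle; task and argument names are illustrative assumptions.
from celery import shared_task
from django.conf import settings
from django.core.mail import send_mail


@shared_task
def send_booking_confirmation_email(recipient, booking_reference):
    send_mail(
        subject="Booking confirmation",
        message=f"Your booking {booking_reference} has been confirmed.",
        from_email=settings.DEFAULT_FROM_EMAIL,
        recipient_list=[recipient],
        fail_silently=False,
    )
```

A view would then queue it with `send_booking_confirmation_email.delay(user.email, booking.id)` so the HTTP request isn't blocked waiting on SMTP.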
You can run the entire application stack using Docker Compose:
- Create a `.env` file from the template:
cp .env.example .env
# Edit .env with your configuration
- Build and start the containers:
docker-compose up --build
This will start the following services:
- Django web application (http://localhost:8000)
- MySQL database
- Redis for Celery results backend
- RabbitMQ for message broker
- Celery worker for background tasks
You can access:
- RabbitMQ management interface at http://localhost:15672 (guest/guest)
- Django admin interface at http://localhost:8000/admin
- API documentation at http://localhost:8000/swagger/
- Mailpit web interface at http://localhost:8025
- Run migrations inside the container:
docker-compose exec web python manage.py migrate
- Create a superuser:
docker-compose exec web python manage.py createsuperuser
- (Optional) Seed the database:
docker-compose exec web python manage.py seed
To stop the services:
docker-compose down
To stop the services and remove all data (volumes):
docker-compose down -v
The application uses Mailpit for email testing in the development environment. When running with Docker Compose:
- SMTP server is available at port 1025
- Web interface to view emails is available at http://localhost:8025
- No authentication required for SMTP
- All emails are caught by Mailpit and won't actually be sent
- You can view HTML and plain text versions of emails
- Supports attachments and different email clients
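For this to work, Django's email settings just need to point at Mailpit. A sketch of the relevant `settings.py` fragment, assuming the values come from the environment variables shown in the `.env` example below (the project's actual settings may differ):

```python
# settings.py (sketch) - email configuration read from the environment so the
# same code works with Mailpit locally and a real SMTP server in production.
import os

EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
EMAIL_HOST = os.environ.get("EMAIL_HOST", "mailpit")
EMAIL_PORT = int(os.environ.get("EMAIL_PORT", "1025"))
EMAIL_USE_TLS = os.environ.get("EMAIL_USE_TLS", "False") == "True"
DEFAULT_FROM_EMAIL = os.environ.get("DEFAULT_FROM_EMAIL", "test@alxtravelapp.com")
```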
- Docker and Docker Compose installed on your system
- Git installed on your system
- Clone the repository:
git clone <repository-url>
cd alx_travel_app_0x03
- Create a `.env` file in the project root with the following environment variables:
# Django Settings
DEBUG=True
SECRET_KEY=your-secret-key
ALLOWED_HOSTS=localhost,127.0.0.1
# Database Settings
MYSQL_ROOT_PASSWORD=rootpassword
MYSQL_DATABASE=alx_travel_app
MYSQL_USER=alx_travel_app_user
MYSQL_PASSWORD=password123
MYSQL_HOST=db
MYSQL_PORT=3306
# RabbitMQ Settings
RABBITMQ_DEFAULT_USER=guest
RABBITMQ_DEFAULT_PASS=guest
CELERY_BROKER_URL=amqp://guest:guest@rabbitmq:5672/
CELERY_RESULT_BACKEND=redis://redis:6379/0
# Email Settings (Using Mailpit for local development)
EMAIL_HOST=mailpit
EMAIL_PORT=1025
EMAIL_USE_TLS=False
DEFAULT_FROM_EMAIL=test@alxtravelapp.com
- Build and start the Docker containers:
docker compose up --build
- Once all services are running, in a new terminal, run the database migrations:
docker compose exec web python manage.py migrate
- Create a superuser account:
docker compose exec web python manage.py createsuperuser
Follow the prompts to create your admin account.
After completing the setup, you can access the following services:
- Django Application: http://localhost:8000
- Django Admin Interface: http://localhost:8000/admin
- API Documentation: http://localhost:8000/api/docs/
- RabbitMQ Management: http://localhost:15672 (guest/guest)
- Mailpit Web Interface: http://localhost:8025 (for viewing sent emails)
The project uses Mailpit for local email testing. All emails sent by the application in development will be caught by Mailpit and can be viewed in its web interface at http://localhost:8025. No emails will actually be sent to real email addresses.
- Stop all services:
docker compose down
- View logs of all services:
docker compose logs
- View logs of a specific service:
docker compose logs [service_name] # e.g., docker compose logs web
- Restart a specific service:
docker compose restart [service_name] # e.g., docker compose restart web
- Run Django management commands:
docker compose exec web python manage.py [command]
- Database Connection Issues:
  - Ensure the MySQL container is running:
docker compose ps
  - Check MySQL logs:
docker compose logs db
  - Verify the database credentials in the `.env` file
- Email Testing Issues:
  - Ensure the Mailpit container is running
  - Check Mailpit logs:
docker compose logs mailpit
  - Verify the email settings in your Django configuration
- Container Start-up Issues:
  - Remove all containers and volumes:
docker compose down -v
  - Rebuild all containers:
docker compose up --build
- Make changes to your code
- The Django development server will automatically reload
- If you modify dependencies:
  - Update `requirements.txt`
  - Rebuild the containers:
docker compose up --build
- If you modify models:
  - Create migrations:
docker compose exec web python manage.py makemigrations
  - Apply migrations:
docker compose exec web python manage.py migrate
The application uses Chapa for payment processing. Here's how to test the payment flow:
First, create a payment for a booking using the `/api/payments/` endpoint:
{
"booking": "YOUR_BOOKING_ID",
"amount": "1250.00",
"currency": "ETB",
"email": "user@example.com",
"phone_number": "+251911234567",
"first_name": "Test",
"last_name": "User",
"payment_title": "Booking Payment",
"description": "Payment for 5 nights stay"
}
After creating the payment, initialize it using:
POST /api/payments/{payment_id}/initialize/
The response will include a `checkout_url`, and the system will automatically send an email to the user with:
- A link to complete the payment
- Booking details
- Payment amount
- Expiry information
You can check the email in Mailpit (http://localhost:8025).
You can complete the payment in two ways:
- Click the payment link in your email
- Use the `checkout_url` from the API response
Both will take you to the Chapa checkout interface.
After successful payment, you'll be redirected to the success page.
You can verify the payment status in two ways:
- Check the payment details:
GET /api/payments/{payment_id}/
- Explicitly verify with Chapa:
GET /api/payments/{payment_id}/verify/
The verification response will show the updated payment status. After successful verification, you can check the payment details to confirm the status has been updated to "completed".
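Under the hood, the verify endpoint is expected to call Chapa's transaction-verification API with the secret key from your environment. A rough sketch of that call; the endpoint and response fields follow Chapa's public documentation and should be confirmed there, and the function name is purely illustrative:

```python
# Sketch: server-side verification against Chapa's API. CHAPA_SECRET_KEY is
# assumed to be exposed as a Django setting; see the production notes below.
import requests
from django.conf import settings


def verify_chapa_payment(tx_ref: str) -> bool:
    response = requests.get(
        f"https://api.chapa.co/v1/transaction/verify/{tx_ref}",
        headers={"Authorization": f"Bearer {settings.CHAPA_SECRET_KEY}"},
        timeout=10,
    )
    data = response.json()
    return data.get("status") == "success"
```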
The system sends several types of email notifications throughout the booking and payment process:
- Booking Confirmation Email
  - Sent automatically when a booking is created
  - Contains booking details and the reference number
- Payment Checkout Email
  - Sent when a payment is initialized
  - Contains the payment link and booking details
  - Includes the amount to be paid and expiry information
- Payment Confirmation Email
  - Sent after successful payment verification
  - Contains transaction details and the booking confirmation
You can view all emails in the Mailpit interface at http://localhost:8025
- Create a new payment:
curl -X POST http://localhost:8000/api/payments/ \
-H "Content-Type: application/json" \
-d '{
"booking": "YOUR_BOOKING_ID",
"amount": "1250.00",
"currency": "ETB",
"email": "test@example.com",
"phone_number": "+251911234567",
"first_name": "Test",
"last_name": "User",
"payment_title": "Booking Payment",
"description": "Test payment"
}'
- Initialize the payment:
curl -X POST http://localhost:8000/api/payments/{payment_id}/initialize/
- Complete the payment using the test card:
  - Open the checkout URL in your browser
  - Use test card: 4242424242424242
  - Any future expiry date
  - Any 3-digit CVC
  - Any 4-digit PIN
- Verify the payment:
curl -X GET http://localhost:8000/api/payments/{payment_id}/verify/
- Check the results:
  - Payment status should be "completed"
  - Booking status should be "confirmed"
  - Check Mailpit (http://localhost:8025) for the confirmation email
  - View payment details in the Django admin (http://localhost:8000/admin/listings/payment/)
- Payment Initialization Fails:
  - Check your Chapa API key in `.env`
  - Ensure all required fields are provided
  - Check the error message in the response
- Email Not Received:
  - Ensure the Celery worker is running:
docker compose logs celery
  - Check that Mailpit is running:
docker compose ps
  - View Mailpit logs:
docker compose logs mailpit
- Payment Status Not Updating:
  - Check the Chapa webhook logs
  - Try manual verification using the verify endpoint
  - Check the Django logs:
docker compose logs web
Note: This setup uses Chapa's test mode. For production, you'll need to:
- Create a Chapa business account
- Get production API keys
- Update the `CHAPA_SECRET_KEY` in your environment variables