feat: create auction history statistics tables and implement APIs #80
base: dev
Conversation
✅ Test results for PR — Build: success 🧪 Tests run with Gradle
Codecov Report — ❌ patch coverage
Pull request overview
This PR implements a comprehensive statistics system for auction history data, replacing the previous simple daily minimum price tracking with detailed daily and weekly statistics across three aggregation levels (items, subcategories, and top categories).
Key changes:
- Creates 6 new statistics tables (daily and weekly for items, subcategories, and top categories) with appropriate indexes and unique constraints
- Implements scheduled batch jobs that calculate daily statistics at 3 AM every day and weekly statistics at 4 AM every Monday
- Provides REST APIs for querying all statistics types with pagination support
Reviewed changes
Copilot reviewed 64 out of 64 changed files in this pull request and generated 9 comments.
| File | Description |
|---|---|
| V13__create_statistics_tables.sql | Creates 6 statistics tables and removes old item_daily_min_price table |
| ItemDailyStatisticsRepository.java | Aggregates auction history into daily item statistics using native SQL |
| SubcategoryDailyStatisticsRepository.java | Aggregates item daily statistics into subcategory daily statistics |
| TopCategoryDailyStatisticsRepository.java | Aggregates subcategory daily statistics into top category daily statistics |
| ItemWeeklyStatisticsRepository.java | Aggregates daily item statistics into weekly statistics |
| SubcategoryWeeklyStatisticsRepository.java | Aggregates weekly item statistics into subcategory weekly statistics |
| TopCategoryWeeklyStatisticsRepository.java | Aggregates weekly subcategory statistics into top category weekly statistics |
| DailyStatisticsService.java | Orchestrates daily statistics calculation in 3 steps |
| WeeklyStatisticsService.java | Orchestrates weekly statistics calculation in 3 steps |
| DailyStatisticsScheduler.java | Schedules daily statistics calculation at 3 AM |
| WeeklyStatisticsScheduler.java | Schedules weekly statistics calculation at 4 AM Monday |
| Item/Subcategory/TopCategory Statistics Services | Provides query methods for each statistics type |
| Item/Subcategory/TopCategory Statistics Controllers | REST endpoints for accessing statistics with pagination |
| Statistics domain entities | JPA entities for all 6 statistics tables |
| Statistics mappers | MapStruct interfaces for entity-to-DTO conversion |
| Statistics response DTOs | Response objects with Swagger documentation |
| SecurityConfig.java | Adds /statistics/** endpoints to permitted paths |
| PageRequestDto.java | Fixes page calculation logic for null page values |
| application-test.yml | Adds test configuration for statistics cron expressions |
| docker-compose files | Updates port configuration and removes old min-price cron |
| Removed ItemDailyMinPrice files | Removes deprecated minimum price tracking code |
FROM item_weekly_statistics iws
INNER JOIN auction_history ah ON iws.item_name = ah.item_name
WHERE iws.week_start_date = DATE_SUB(DATE_SUB(CURDATE(), INTERVAL WEEKDAY(CURDATE()) DAY), INTERVAL 7 DAY)
GROUP BY ah.item_sub_category, iws.year, iws.week_number, iws.week_start_date
Copilot AI · Dec 26, 2025
The JOIN condition in the weekly statistics query may produce incorrect results. The query joins item_weekly_statistics with auction_history on item_name, but auction_history contains many rows for each item across different dates, so the join multiplies each statistics row and skews the aggregations. The query should either use a DISTINCT subquery that yields one sub_category per item_name, or maintain a separate mapping table, to avoid the Cartesian product.
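As a minimal sketch of the DISTINCT-subquery approach (the aggregate columns in the SELECT list are not visible in this diff and are left as a placeholder comment):

```sql
SELECT m.item_sub_category,
       iws.year,
       iws.week_number,
       iws.week_start_date
       -- , SUM(...) / AVG(...) over the item_weekly_statistics columns as in the original query
FROM item_weekly_statistics iws
INNER JOIN (
    -- one row per item: assumes each item_name maps to a single sub_category
    SELECT DISTINCT item_name, item_sub_category
    FROM auction_history
) m ON m.item_name = iws.item_name
WHERE iws.week_start_date = DATE_SUB(DATE_SUB(CURDATE(), INTERVAL WEEKDAY(CURDATE()) DAY), INTERVAL 7 DAY)
GROUP BY m.item_sub_category, iws.year, iws.week_number, iws.week_start_date;
```

If the same item can appear under more than one sub_category in auction_history, this mapping would still fan out, in which case the separate mapping table mentioned above would be the safer option.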
FROM subcategory_daily_statistics sds
INNER JOIN auction_history ah ON sds.item_sub_category = ah.item_sub_category
AND DATE(ah.date_auction_buy) = sds.date_auction_buy
WHERE sds.date_auction_buy = DATE(DATE_SUB(NOW(), INTERVAL 9 HOUR))
GROUP BY ah.item_top_category, sds.date_auction_buy
Copilot AI · Dec 26, 2025
The JOIN condition in the daily statistics query may produce incorrect results. The query joins subcategory_daily_statistics with auction_history on both item_sub_category and date_auction_buy, but this will create duplicate rows if multiple items exist in the same subcategory on the same date. This will cause the SUM aggregations to be multiplied incorrectly. The query should use a DISTINCT subquery to get the unique top_category for each sub_category, rather than joining with the full auction_history table.
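The same DISTINCT-mapping approach, sketched for this query (aggregate columns again omitted, as they are not shown in the diff):

```sql
SELECT m.item_top_category,
       sds.date_auction_buy
       -- , SUM(...) over the subcategory_daily_statistics columns as in the original query
FROM subcategory_daily_statistics sds
INNER JOIN (
    -- assumes each sub_category belongs to exactly one top_category
    SELECT DISTINCT item_sub_category, item_top_category
    FROM auction_history
) m ON m.item_sub_category = sds.item_sub_category
WHERE sds.date_auction_buy = DATE(DATE_SUB(NOW(), INTERVAL 9 HOUR))
GROUP BY m.item_top_category, sds.date_auction_buy;
```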
FROM item_daily_statistics ids
INNER JOIN auction_history ah ON ids.item_name = ah.item_name
AND DATE(ah.date_auction_buy) = ids.date_auction_buy
WHERE ids.date_auction_buy = DATE(DATE_SUB(NOW(), INTERVAL 9 HOUR))
GROUP BY ah.item_sub_category, ids.date_auction_buy
Copilot AI · Dec 26, 2025
The JOIN condition in the daily statistics query may produce incorrect results. The query joins item_daily_statistics with auction_history on both item_name and date_auction_buy, but this will create duplicate rows if the same item has multiple transactions on the same date. This will cause the SUM aggregations to be multiplied incorrectly. The query should use a DISTINCT subquery to get the unique sub_category for each item_name, rather than joining with the full auction_history table.
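Sketched with the same pattern for the item-to-subcategory rollup (aggregate columns omitted; assumes one sub_category per item_name):

```sql
SELECT m.item_sub_category,
       ids.date_auction_buy
       -- , SUM(...) over the item_daily_statistics columns as in the original query
FROM item_daily_statistics ids
INNER JOIN (
    SELECT DISTINCT item_name, item_sub_category
    FROM auction_history
) m ON m.item_name = ids.item_name
WHERE ids.date_auction_buy = DATE(DATE_SUB(NOW(), INTERVAL 9 HOUR))
GROUP BY m.item_sub_category, ids.date_auction_buy;
```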
FROM subcategory_weekly_statistics sws
INNER JOIN auction_history ah ON sws.item_sub_category = ah.item_sub_category
WHERE sws.week_start_date = DATE_SUB(DATE_SUB(CURDATE(), INTERVAL WEEKDAY(CURDATE()) DAY), INTERVAL 7 DAY)
GROUP BY ah.item_top_category, sws.year, sws.week_number, sws.week_start_date
Copilot AI · Dec 26, 2025
The JOIN condition in the weekly statistics query may produce incorrect results. The query joins subcategory_weekly_statistics with auction_history on item_sub_category, but auction_history contains many rows for each subcategory across different dates, so the join duplicates rows and skews the aggregations. The query should either use a DISTINCT subquery that yields one top_category per sub_category, or maintain a separate mapping table, to avoid the Cartesian product.
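A sketch of that fix for the subcategory-to-top-category rollup (aggregate columns omitted; assumes one top_category per sub_category):

```sql
SELECT m.item_top_category,
       sws.year,
       sws.week_number,
       sws.week_start_date
       -- , SUM(...) over the subcategory_weekly_statistics columns as in the original query
FROM subcategory_weekly_statistics sws
INNER JOIN (
    SELECT DISTINCT item_sub_category, item_top_category
    FROM auction_history
) m ON m.item_sub_category = sws.item_sub_category
WHERE sws.week_start_date = DATE_SUB(DATE_SUB(CURDATE(), INTERVAL WEEKDAY(CURDATE()) DAY), INTERVAL 7 DAY)
GROUP BY m.item_top_category, sws.year, sws.week_number, sws.week_start_date;
```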
| - "${SERVER_PORT}:${SERVER_PORT}" | ||
| env_file: | ||
| - .env | ||
| - .env: |
Copilot AI · Dec 26, 2025
There's an error in the env_file configuration. The entry should be "- .env" (a plain string list item), not "- .env:". The trailing colon makes the item parse as a YAML mapping rather than a file path, and Docker Compose will reject the file when it validates this configuration.
Suggested change:
-    - .env:
+    - .env
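For context, a minimal sketch of the corrected service block (the service name and surrounding keys are assumptions; only the port mapping and env_file lines appear in the diff):

```yaml
services:
  app:                                   # service name assumed for illustration
    ports:
      - "${SERVER_PORT}:${SERVER_PORT}"
    env_file:
      - .env                             # plain string entry, no trailing colon
```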
public Pageable toPageable() {
-    int resolvedPage = this.page != null ? this.page - 1 : DEFAULT_PAGE;
+    int resolvedPage = this.page != null ? this.page - 1 : DEFAULT_PAGE - 1;
Copilot AI · Dec 26, 2025
The page calculation is harder to follow than it needs to be. When page is null, the resolved page should be 0 (Spring's PageRequest uses 0-based indexing); the new expression "DEFAULT_PAGE - 1" does evaluate to 0 when DEFAULT_PAGE is 1, and "page - 1" is correct for an explicit 1-based value such as page=1, so the behavior is right but the intent is obscured by deriving 0 from a 1-based constant. Consider changing DEFAULT_PAGE to 0, or simplifying the fallback to: "int resolvedPage = this.page != null ? this.page - 1 : 0;"
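A minimal sketch of the simplified version (the size field, the DEFAULT_SIZE constant, and the class shape are assumptions; only toPageable and the page fallback appear in the diff):

```java
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;

public class PageRequestDto {

    private static final int DEFAULT_SIZE = 20; // assumed default, not shown in the diff

    private Integer page; // 1-based page number from the client; may be null
    private Integer size; // page size from the client; may be null

    public Pageable toPageable() {
        // Spring's PageRequest is 0-based: shift the 1-based client value down by one,
        // and fall back to the first page (0) when no page parameter was supplied.
        int resolvedPage = this.page != null ? this.page - 1 : 0;
        int resolvedSize = this.size != null ? this.size : DEFAULT_SIZE;
        return PageRequest.of(resolvedPage, resolvedSize);
    }
}
```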
env_file:
- .env
- .env:

Copilot AI · Dec 26, 2025
There's an unnecessary blank line after the env_file configuration. This empty line (line 14) should be removed to maintain consistent YAML formatting.
private final DailyStatisticsService dailyStatisticsService;

/** Calculates and stores daily statistics each morning. Default cron: 3 AM daily (configurable). */
@Scheduled(cron = "${statistics.daily.cron:5 0 3 * * *}", zone = "Asia/Seoul")
Copilot AI · Dec 26, 2025
The default cron expression starts with "5" in the seconds field, so the job fires at 3:00:05 AM rather than 3:00 AM. Spring's cron format is "second minute hour day month weekday"; if the intent is to run exactly at 3 AM daily, as the comment states, the default should be "0 0 3 * * *".
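A sketch of the scheduler with that default (the class wiring and method names are assumptions; only the field, the Javadoc, and the annotation appear in the diff):

```java
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class DailyStatisticsScheduler {

    private final DailyStatisticsService dailyStatisticsService;

    public DailyStatisticsScheduler(DailyStatisticsService dailyStatisticsService) {
        this.dailyStatisticsService = dailyStatisticsService;
    }

    /** Calculates and stores daily statistics; fires at 3:00 AM KST unless statistics.daily.cron overrides it. */
    @Scheduled(cron = "${statistics.daily.cron:0 0 3 * * *}", zone = "Asia/Seoul")
    public void runDailyStatistics() {
        // the service method name is assumed for illustration
        dailyStatisticsService.calculateDailyStatistics();
    }
}
```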
private final WeeklyStatisticsService weeklyStatisticsService;

/** Calculates and stores weekly statistics each Monday morning, aggregating the previous week's data. Default cron: 4 AM Monday (configurable). */
@Scheduled(cron = "${statistics.weekly.cron:5 0 4 * * MON}", zone = "Asia/Seoul")
Copilot AI · Dec 26, 2025
The default cron expression starts with "5" in the seconds field, so the job fires at 4:00:05 AM rather than 4:00 AM. Spring's cron format is "second minute hour day month weekday"; if the intent is to run exactly at 4 AM on Mondays, as the comment states, the default should be "0 0 4 * * MON".
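Since the PR also wires these expressions through configuration (see application-test.yml in the file summary), the corrected values could equally be supplied as properties; a sketch, with the property keys taken from the annotation defaults and the file placement assumed:

```yaml
# application.yml (placement assumed): overrides for the scheduler cron defaults
statistics:
  daily:
    cron: "0 0 3 * * *"      # 3:00 AM every day, Asia/Seoul per the annotation's zone
  weekly:
    cron: "0 0 4 * * MON"    # 4:00 AM every Monday
```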
📋 Detailed description
📊 Checklist
Proceeding without issues