25 commits b0aa1c2b28 ... a387a4487b

Author SHA1 Message Date
  maxfeng a387a4487b refactor(MAPatternStrategy): comment out the configuration for OP trading instruments 2 weeks ago
  maxfeng 20b1c2f7b7 feat(AGENTS): add skills system documentation listing the available skills and their descriptions 4 weeks ago
  maxfeng 00ada5da83 feat(MAPatternStrategy): enhance MA convergence handling and dynamic margin adjustment 1 month ago
  maxfeng 14e5316522 refactor(MAPatternStrategy): remove redundant margin-validation logic 1 month ago
  maxfeng ed21be3563 feat(MAPatternStrategy): update entry logic to support per-lot margin limit checks 1 month ago
  maxfeng 1dbcdcd5e4 feat(MAPatternStrategy): enhance post-entry margin validation and recording (note: the feature in this version is wrong; the margin calculation method is incorrect) 1 month ago
  maxfeng 0be3a65f82 fix(MAPatternStrategy): correct the logic for computing the recent average change 1 month ago
  maxfeng 3ef7c84902 feat(MAPatternStrategy): enhance the MA-pattern trading strategy with MA60 and entry-record snapshots 1 month ago
  maxfeng 881ec2b1f3 feat(records_analysis): add a futures entry-record analysis tool 1 month ago
  maxfeng b086e6cc55 feat(MAPatternStrategy): enhance the MA-pattern trading strategy 1 month ago
  maxfeng 3955341360 fix(MAPatternStrategy): update instrument selection and position-check logic 1 month ago
  maxfeng 188f738355 feat(training): add user entry-confidence index input and handling 1 month ago
  maxfeng bc1a23f456 feat(MAPatternStrategy): add opening-gap filtering and refactor the related logic 1 month ago
  maxfeng 30e54075f6 1. Rename the partial and full image files 1 month ago
  maxfeng 6c4f45d12a Improve the trade-training tool's plotting: color the current-day candlestick by trade direction, refine text-box placement and connector-line drawing for clearer display, and change random trade selection to sample by contract type so similar trades do not appear back to back. 1 month ago
  maxfeng c3db52931b 1. Improve the result-analysis tool: 1 month ago
  maxfeng 39b9aff47c 1. Add a trade-training tool to assist with trend-following trades. 1 month ago
  maxfeng f6b2c91188 Add consecutive trade-pair detection: extract each instrument's core letters, assign consecutive-pair IDs, and extend the statistics output with the number of consecutive pairs and the trades involved. Update the result-saving logic to adjust column order. 1 month ago
  maxfeng 1e9e7e0761 Refine MAPatternStrategy_v002.py: simplify the entry-condition check to rely only on the MA-crossover score, update the scoring function to return score details, and enrich logging for easier debugging. 1 month ago
  maxfeng 958d8b8ed0 Update MAPatternStrategy_v002.py: improve the rollover trade-record update logic so cost price and entry time stay accurate. 1 month ago
  maxfeng 0a849ca10e Tune the MA-crossover score check: stricter intraday, more lenient before the close. 1 month ago
  maxfeng 3ff6dc4444 Add an order-status check to the trend-following strategy: if an order stays "new", treat it as a non-trading session and block all trades for that day, to avoid trading ahead of holidays when there is no night session. 2 months ago
  maxfeng 0c27add474 Fix the margin calculation in the trend-following strategy 2 months ago
  maxfeng 942d5c1967 1. MAPatternStrategy_v001.py strategy completed; 2025 backtest results look very good 2 months ago
  maxfeng 4f7de0c45f Improve the K-line reconstruction tool 'kline_reconstruction.py': broader encoding compatibility when reading CSV files, support for multiple date formats, better trading-day calculation, plots annotated with fill price and order time, plus a new directory-packaging feature for easier output management. 2 months ago
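
Several of the commits above deal with margin handling (the per-lot margin limit check in ed21be3563, the margin-calculation fix in 0c27add474, and the admittedly broken attempt in 1dbcdcd5e4). As a hedged sketch of the usual arithmetic — not code taken from the repository — the per-lot margin of a futures contract is its notional value times the margin rate, using the `multiplier` and `margin_rate` fields that appear in the strategy's futures config further down:

```python
def estimate_per_lot_margin(price: float, multiplier: float, margin_rate: float) -> float:
    """Rough per-lot margin estimate: notional value (price * multiplier) times the margin rate."""
    return price * multiplier * margin_rate

# Hypothetical numbers using the RB parameters from the config below (multiplier 10, 7% margin)
# and an assumed price of 3500: 3500 * 10 * 0.07 = 2450 yuan per lot.
print(estimate_per_lot_margin(3500, 10, 0.07))
```

A cap such as `g.max_margin_per_position = 20000` in the old strategy file below would then bound how many lots a single instrument may hold.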

+ 96 - 0
.spec-workflow/templates/design-template.md

@@ -0,0 +1,96 @@
+# Design Document
+
+## Overview
+
+[High-level description of the feature and its place in the overall system]
+
+## Steering Document Alignment
+
+### Technical Standards (tech.md)
+[How the design follows documented technical patterns and standards]
+
+### Project Structure (structure.md)
+[How the implementation will follow project organization conventions]
+
+## Code Reuse Analysis
+[What existing code will be leveraged, extended, or integrated with this feature]
+
+### Existing Components to Leverage
+- **[Component/Utility Name]**: [How it will be used]
+- **[Service/Helper Name]**: [How it will be extended]
+
+### Integration Points
+- **[Existing System/API]**: [How the new feature will integrate]
+- **[Database/Storage]**: [How data will connect to existing schemas]
+
+## Architecture
+
+[Describe the overall architecture and design patterns used]
+
+### Modular Design Principles
+- **Single File Responsibility**: Each file should handle one specific concern or domain
+- **Component Isolation**: Create small, focused components rather than large monolithic files
+- **Service Layer Separation**: Separate data access, business logic, and presentation layers
+- **Utility Modularity**: Break utilities into focused, single-purpose modules
+
+```mermaid
+graph TD
+    A[Component A] --> B[Component B]
+    B --> C[Component C]
+```
+
+## Components and Interfaces
+
+### Component 1
+- **Purpose:** [What this component does]
+- **Interfaces:** [Public methods/APIs]
+- **Dependencies:** [What it depends on]
+- **Reuses:** [Existing components/utilities it builds upon]
+
+### Component 2
+- **Purpose:** [What this component does]
+- **Interfaces:** [Public methods/APIs]
+- **Dependencies:** [What it depends on]
+- **Reuses:** [Existing components/utilities it builds upon]
+
+## Data Models
+
+### Model 1
+```
+[Define the structure of Model1 in your language]
+- id: [unique identifier type]
+- name: [string/text type]
+- [Additional properties as needed]
+```
+
+### Model 2
+```
+[Define the structure of Model2 in your language]
+- id: [unique identifier type]
+- [Additional properties as needed]
+```
+
+## Error Handling
+
+### Error Scenarios
+1. **Scenario 1:** [Description]
+   - **Handling:** [How to handle]
+   - **User Impact:** [What user sees]
+
+2. **Scenario 2:** [Description]
+   - **Handling:** [How to handle]
+   - **User Impact:** [What user sees]
+
+## Testing Strategy
+
+### Unit Testing
+- [Unit testing approach]
+- [Key components to test]
+
+### Integration Testing
+- [Integration testing approach]
+- [Key flows to test]
+
+### End-to-End Testing
+- [E2E testing approach]
+- [User scenarios to test]
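
The Data Models section of this template is deliberately language-agnostic. Purely as an illustration — the field names and types here are assumptions, not part of the template — the Model 1 placeholder might be concretized in Python like this:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Model1:
    """Hypothetical concretization of the Model 1 placeholder."""
    id: int                             # unique identifier
    name: str                           # string/text field
    description: Optional[str] = None   # additional property as needed
```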

+ 51 - 0
.spec-workflow/templates/product-template.md

@@ -0,0 +1,51 @@
+# Product Overview
+
+## Product Purpose
+[Describe the core purpose of this product/project. What problem does it solve?]
+
+## Target Users
+[Who are the primary users of this product? What are their needs and pain points?]
+
+## Key Features
+[List the main features that deliver value to users]
+
+1. **Feature 1**: [Description]
+2. **Feature 2**: [Description]
+3. **Feature 3**: [Description]
+
+## Business Objectives
+[What are the business goals this product aims to achieve?]
+
+- [Objective 1]
+- [Objective 2]
+- [Objective 3]
+
+## Success Metrics
+[How will we measure the success of this product?]
+
+- [Metric 1]: [Target]
+- [Metric 2]: [Target]
+- [Metric 3]: [Target]
+
+## Product Principles
+[Core principles that guide product decisions]
+
+1. **[Principle 1]**: [Explanation]
+2. **[Principle 2]**: [Explanation]
+3. **[Principle 3]**: [Explanation]
+
+## Monitoring & Visibility (if applicable)
+[How do users track progress and monitor the system?]
+
+- **Dashboard Type**: [e.g., Web-based, CLI, Desktop app]
+- **Real-time Updates**: [e.g., WebSocket, polling, push notifications]
+- **Key Metrics Displayed**: [What information is most important to surface]
+- **Sharing Capabilities**: [e.g., read-only links, exports, reports]
+
+## Future Vision
+[Where do we see this product evolving in the future?]
+
+### Potential Enhancements
+- **Remote Access**: [e.g., Tunnel features for sharing dashboards with stakeholders]
+- **Analytics**: [e.g., Historical trends, performance metrics]
+- **Collaboration**: [e.g., Multi-user support, commenting]

+ 50 - 0
.spec-workflow/templates/requirements-template.md

@@ -0,0 +1,50 @@
+# Requirements Document
+
+## Introduction
+
+[Provide a brief overview of the feature, its purpose, and its value to users]
+
+## Alignment with Product Vision
+
+[Explain how this feature supports the goals outlined in product.md]
+
+## Requirements
+
+### Requirement 1
+
+**User Story:** As a [role], I want [feature], so that [benefit]
+
+#### Acceptance Criteria
+
+1. WHEN [event] THEN [system] SHALL [response]
+2. IF [precondition] THEN [system] SHALL [response]
+3. WHEN [event] AND [condition] THEN [system] SHALL [response]
+
+### Requirement 2
+
+**User Story:** As a [role], I want [feature], so that [benefit]
+
+#### Acceptance Criteria
+
+1. WHEN [event] THEN [system] SHALL [response]
+2. IF [precondition] THEN [system] SHALL [response]
+
+## Non-Functional Requirements
+
+### Code Architecture and Modularity
+- **Single Responsibility Principle**: Each file should have a single, well-defined purpose
+- **Modular Design**: Components, utilities, and services should be isolated and reusable
+- **Dependency Management**: Minimize interdependencies between modules
+- **Clear Interfaces**: Define clean contracts between components and layers
+
+### Performance
+- [Performance requirements]
+
+### Security
+- [Security requirements]
+
+### Reliability
+- [Reliability requirements]
+
+### Usability
+- [Usability requirements]

+ 145 - 0
.spec-workflow/templates/structure-template.md

@@ -0,0 +1,145 @@
+# Project Structure
+
+## Directory Organization
+
+```
+[Define your project's directory structure. Examples below - adapt to your project type]
+
+Example for a library/package:
+project-root/
+├── src/                    # Source code
+├── tests/                  # Test files  
+├── docs/                   # Documentation
+├── examples/               # Usage examples
+└── [build/dist/out]        # Build output
+
+Example for an application:
+project-root/
+├── [src/app/lib]           # Main source code
+├── [assets/resources]      # Static resources
+├── [config/settings]       # Configuration
+├── [scripts/tools]         # Build/utility scripts
+└── [tests/spec]            # Test files
+
+Common patterns:
+- Group by feature/module
+- Group by layer (UI, business logic, data)
+- Group by type (models, controllers, views)
+- Flat structure for simple projects
+```
+
+## Naming Conventions
+
+### Files
+- **Components/Modules**: [e.g., `PascalCase`, `snake_case`, `kebab-case`]
+- **Services/Handlers**: [e.g., `UserService`, `user_service`, `user-service`]
+- **Utilities/Helpers**: [e.g., `dateUtils`, `date_utils`, `date-utils`]
+- **Tests**: [e.g., `[filename]_test`, `[filename].test`, `[filename]Test`]
+
+### Code
+- **Classes/Types**: [e.g., `PascalCase`, `CamelCase`, `snake_case`]
+- **Functions/Methods**: [e.g., `camelCase`, `snake_case`, `PascalCase`]
+- **Constants**: [e.g., `UPPER_SNAKE_CASE`, `SCREAMING_CASE`, `PascalCase`]
+- **Variables**: [e.g., `camelCase`, `snake_case`, `lowercase`]
+
+## Import Patterns
+
+### Import Order
+1. External dependencies
+2. Internal modules
+3. Relative imports
+4. Style imports
+
+### Module/Package Organization
+```
+[Describe your project's import/include patterns]
+Examples:
+- Absolute imports from project root
+- Relative imports within modules
+- Package/namespace organization
+- Dependency management approach
+```
+
+## Code Structure Patterns
+
+[Define common patterns for organizing code within files. Below are examples - choose what applies to your project]
+
+### Module/Class Organization
+```
+Example patterns:
+1. Imports/includes/dependencies
+2. Constants and configuration
+3. Type/interface definitions
+4. Main implementation
+5. Helper/utility functions
+6. Exports/public API
+```
+
+### Function/Method Organization
+```
+Example patterns:
+- Input validation first
+- Core logic in the middle
+- Error handling throughout
+- Clear return points
+```
+
+### File Organization Principles
+```
+Choose what works for your project:
+- One class/module per file
+- Related functionality grouped together
+- Public API at the top/bottom
+- Implementation details hidden
+```
+
+## Code Organization Principles
+
+1. **Single Responsibility**: Each file should have one clear purpose
+2. **Modularity**: Code should be organized into reusable modules
+3. **Testability**: Structure code to be easily testable
+4. **Consistency**: Follow patterns established in the codebase
+
+## Module Boundaries
+[Define how different parts of your project interact and maintain separation of concerns]
+
+Examples of boundary patterns:
+- **Core vs Plugins**: Core functionality vs extensible plugins
+- **Public API vs Internal**: What's exposed vs implementation details  
+- **Platform-specific vs Cross-platform**: OS-specific code isolation
+- **Stable vs Experimental**: Production code vs experimental features
+- **Dependencies direction**: Which modules can depend on which
+
+## Code Size Guidelines
+[Define your project's guidelines for file and function sizes]
+
+Suggested guidelines:
+- **File size**: [Define maximum lines per file]
+- **Function/Method size**: [Define maximum lines per function]
+- **Class/Module complexity**: [Define complexity limits]
+- **Nesting depth**: [Maximum nesting levels]
+
+## Dashboard/Monitoring Structure (if applicable)
+[How dashboard or monitoring components are organized]
+
+### Example Structure:
+```
+src/
+└── dashboard/          # Self-contained dashboard subsystem
+    ├── server/        # Backend server components
+    ├── client/        # Frontend assets
+    ├── shared/        # Shared types/utilities
+    └── public/        # Static assets
+```
+
+### Separation of Concerns
+- Dashboard isolated from core business logic
+- Own CLI entry point for independent operation
+- Minimal dependencies on main application
+- Can be disabled without affecting core functionality
+
+## Documentation Standards
+- All public APIs must have documentation
+- Complex logic should include inline comments
+- README files for major modules
+- Follow language-specific documentation conventions

+ 139 - 0
.spec-workflow/templates/tasks-template.md

@@ -0,0 +1,139 @@
+# Tasks Document
+
+- [ ] 1. Create core interfaces in src/types/feature.ts
+  - File: src/types/feature.ts
+  - Define TypeScript interfaces for feature data structures
+  - Extend existing base interfaces from base.ts
+  - Purpose: Establish type safety for feature implementation
+  - _Leverage: src/types/base.ts_
+  - _Requirements: 1.1_
+  - _Prompt: Role: TypeScript Developer specializing in type systems and interfaces | Task: Create comprehensive TypeScript interfaces for the feature data structures following requirements 1.1, extending existing base interfaces from src/types/base.ts | Restrictions: Do not modify existing base interfaces, maintain backward compatibility, follow project naming conventions | Success: All interfaces compile without errors, proper inheritance from base types, full type coverage for feature requirements_
+
+- [ ] 2. Create base model class in src/models/FeatureModel.ts
+  - File: src/models/FeatureModel.ts
+  - Implement base model extending BaseModel class
+  - Add validation methods using existing validation utilities
+  - Purpose: Provide data layer foundation for feature
+  - _Leverage: src/models/BaseModel.ts, src/utils/validation.ts_
+  - _Requirements: 2.1_
+  - _Prompt: Role: Backend Developer with expertise in Node.js and data modeling | Task: Create a base model class extending BaseModel and implementing validation following requirement 2.1, leveraging existing patterns from src/models/BaseModel.ts and src/utils/validation.ts | Restrictions: Must follow existing model patterns, do not bypass validation utilities, maintain consistent error handling | Success: Model extends BaseModel correctly, validation methods implemented and tested, follows project architecture patterns_
+
+- [ ] 3. Add specific model methods to FeatureModel.ts
+  - File: src/models/FeatureModel.ts (continue from task 2)
+  - Implement create, update, delete methods
+  - Add relationship handling for foreign keys
+  - Purpose: Complete model functionality for CRUD operations
+  - _Leverage: src/models/BaseModel.ts_
+  - _Requirements: 2.2, 2.3_
+  - _Prompt: Role: Backend Developer with expertise in ORM and database operations | Task: Implement CRUD methods and relationship handling in FeatureModel.ts following requirements 2.2 and 2.3, extending patterns from src/models/BaseModel.ts | Restrictions: Must maintain transaction integrity, follow existing relationship patterns, do not duplicate base model functionality | Success: All CRUD operations work correctly, relationships are properly handled, database operations are atomic and efficient_
+
+- [ ] 4. Create model unit tests in tests/models/FeatureModel.test.ts
+  - File: tests/models/FeatureModel.test.ts
+  - Write tests for model validation and CRUD methods
+  - Use existing test utilities and fixtures
+  - Purpose: Ensure model reliability and catch regressions
+  - _Leverage: tests/helpers/testUtils.ts, tests/fixtures/data.ts_
+  - _Requirements: 2.1, 2.2_
+  - _Prompt: Role: QA Engineer with expertise in unit testing and Jest/Mocha frameworks | Task: Create comprehensive unit tests for FeatureModel validation and CRUD methods covering requirements 2.1 and 2.2, using existing test utilities from tests/helpers/testUtils.ts and fixtures from tests/fixtures/data.ts | Restrictions: Must test both success and failure scenarios, do not test external dependencies directly, maintain test isolation | Success: All model methods are tested with good coverage, edge cases covered, tests run independently and consistently_
+
+- [ ] 5. Create service interface in src/services/IFeatureService.ts
+  - File: src/services/IFeatureService.ts
+  - Define service contract with method signatures
+  - Extend base service interface patterns
+  - Purpose: Establish service layer contract for dependency injection
+  - _Leverage: src/services/IBaseService.ts_
+  - _Requirements: 3.1_
+  - _Prompt: Role: Software Architect specializing in service-oriented architecture and TypeScript interfaces | Task: Design service interface contract following requirement 3.1, extending base service patterns from src/services/IBaseService.ts for dependency injection | Restrictions: Must maintain interface segregation principle, do not expose internal implementation details, ensure contract compatibility with DI container | Success: Interface is well-defined with clear method signatures, extends base service appropriately, supports all required service operations_
+
+- [ ] 6. Implement feature service in src/services/FeatureService.ts
+  - File: src/services/FeatureService.ts
+  - Create concrete service implementation using FeatureModel
+  - Add error handling with existing error utilities
+  - Purpose: Provide business logic layer for feature operations
+  - _Leverage: src/services/BaseService.ts, src/utils/errorHandler.ts, src/models/FeatureModel.ts_
+  - _Requirements: 3.2_
+  - _Prompt: Role: Backend Developer with expertise in service layer architecture and business logic | Task: Implement concrete FeatureService following requirement 3.2, using FeatureModel and extending BaseService patterns with proper error handling from src/utils/errorHandler.ts | Restrictions: Must implement interface contract exactly, do not bypass model validation, maintain separation of concerns from data layer | Success: Service implements all interface methods correctly, robust error handling implemented, business logic is well-encapsulated and testable_
+
+- [ ] 7. Add service dependency injection in src/utils/di.ts
+  - File: src/utils/di.ts (modify existing)
+  - Register FeatureService in dependency injection container
+  - Configure service lifetime and dependencies
+  - Purpose: Enable service injection throughout application
+  - _Leverage: existing DI configuration in src/utils/di.ts_
+  - _Requirements: 3.1_
+  - _Prompt: Role: DevOps Engineer with expertise in dependency injection and IoC containers | Task: Register FeatureService in DI container following requirement 3.1, configuring appropriate lifetime and dependencies using existing patterns from src/utils/di.ts | Restrictions: Must follow existing DI container patterns, do not create circular dependencies, maintain service resolution efficiency | Success: FeatureService is properly registered and resolvable, dependencies are correctly configured, service lifetime is appropriate for use case_
+
+- [ ] 8. Create service unit tests in tests/services/FeatureService.test.ts
+  - File: tests/services/FeatureService.test.ts
+  - Write tests for service methods with mocked dependencies
+  - Test error handling scenarios
+  - Purpose: Ensure service reliability and proper error handling
+  - _Leverage: tests/helpers/testUtils.ts, tests/mocks/modelMocks.ts_
+  - _Requirements: 3.2, 3.3_
+  - _Prompt: Role: QA Engineer with expertise in service testing and mocking frameworks | Task: Create comprehensive unit tests for FeatureService methods covering requirements 3.2 and 3.3, using mocked dependencies from tests/mocks/modelMocks.ts and test utilities | Restrictions: Must mock all external dependencies, test business logic in isolation, do not test framework code | Success: All service methods tested with proper mocking, error scenarios covered, tests verify business logic correctness and error handling_
+
+- [ ] 4. Create API endpoints
+  - Design API structure
+  - _Leverage: src/api/baseApi.ts, src/utils/apiUtils.ts_
+  - _Requirements: 4.0_
+  - _Prompt: Role: API Architect specializing in RESTful design and Express.js | Task: Design comprehensive API structure following requirement 4.0, leveraging existing patterns from src/api/baseApi.ts and utilities from src/utils/apiUtils.ts | Restrictions: Must follow REST conventions, maintain API versioning compatibility, do not expose internal data structures directly | Success: API structure is well-designed and documented, follows existing patterns, supports all required operations with proper HTTP methods and status codes_
+
+- [ ] 4.1 Set up routing and middleware
+  - Configure application routes
+  - Add authentication middleware
+  - Set up error handling middleware
+  - _Leverage: src/middleware/auth.ts, src/middleware/errorHandler.ts_
+  - _Requirements: 4.1_
+  - _Prompt: Role: Backend Developer with expertise in Express.js middleware and routing | Task: Configure application routes and middleware following requirement 4.1, integrating authentication from src/middleware/auth.ts and error handling from src/middleware/errorHandler.ts | Restrictions: Must maintain middleware order, do not bypass security middleware, ensure proper error propagation | Success: Routes are properly configured with correct middleware chain, authentication works correctly, errors are handled gracefully throughout the request lifecycle_
+
+- [ ] 4.2 Implement CRUD endpoints
+  - Create API endpoints
+  - Add request validation
+  - Write API integration tests
+  - _Leverage: src/controllers/BaseController.ts, src/utils/validation.ts_
+  - _Requirements: 4.2, 4.3_
+  - _Prompt: Role: Full-stack Developer with expertise in API development and validation | Task: Implement CRUD endpoints following requirements 4.2 and 4.3, extending BaseController patterns and using validation utilities from src/utils/validation.ts | Restrictions: Must validate all inputs, follow existing controller patterns, ensure proper HTTP status codes and responses | Success: All CRUD operations work correctly, request validation prevents invalid data, integration tests pass and cover all endpoints_
+
+- [ ] 5. Add frontend components
+  - Plan component architecture
+  - _Leverage: src/components/BaseComponent.tsx, src/styles/theme.ts_
+  - _Requirements: 5.0_
+  - _Prompt: Role: Frontend Architect with expertise in React component design and architecture | Task: Plan comprehensive component architecture following requirement 5.0, leveraging base patterns from src/components/BaseComponent.tsx and theme system from src/styles/theme.ts | Restrictions: Must follow existing component patterns, maintain design system consistency, ensure component reusability | Success: Architecture is well-planned and documented, components are properly organized, follows existing patterns and theme system_
+
+- [ ] 5.1 Create base UI components
+  - Set up component structure
+  - Implement reusable components
+  - Add styling and theming
+  - _Leverage: src/components/BaseComponent.tsx, src/styles/theme.ts_
+  - _Requirements: 5.1_
+  - _Prompt: Role: Frontend Developer specializing in React and component architecture | Task: Create reusable UI components following requirement 5.1, extending BaseComponent patterns and using existing theme system from src/styles/theme.ts | Restrictions: Must use existing theme variables, follow component composition patterns, ensure accessibility compliance | Success: Components are reusable and properly themed, follow existing architecture, accessible and responsive_
+
+- [ ] 5.2 Implement feature-specific components
+  - Create feature components
+  - Add state management
+  - Connect to API endpoints
+  - _Leverage: src/hooks/useApi.ts, src/components/BaseComponent.tsx_
+  - _Requirements: 5.2, 5.3_
+  - _Prompt: Role: React Developer with expertise in state management and API integration | Task: Implement feature-specific components following requirements 5.2 and 5.3, using API hooks from src/hooks/useApi.ts and extending BaseComponent patterns | Restrictions: Must use existing state management patterns, handle loading and error states properly, maintain component performance | Success: Components are fully functional with proper state management, API integration works smoothly, user experience is responsive and intuitive_
+
+- [ ] 6. Integration and testing
+  - Plan integration approach
+  - _Leverage: src/utils/integrationUtils.ts, tests/helpers/testUtils.ts_
+  - _Requirements: 6.0_
+  - _Prompt: Role: Integration Engineer with expertise in system integration and testing strategies | Task: Plan comprehensive integration approach following requirement 6.0, leveraging integration utilities from src/utils/integrationUtils.ts and test helpers | Restrictions: Must consider all system components, ensure proper test coverage, maintain integration test reliability | Success: Integration plan is comprehensive and feasible, all system components work together correctly, integration points are well-tested_
+
+- [ ] 6.1 Write end-to-end tests
+  - Set up E2E testing framework
+  - Write user journey tests
+  - Add test automation
+  - _Leverage: tests/helpers/testUtils.ts, tests/fixtures/data.ts_
+  - _Requirements: All_
+  - _Prompt: Role: QA Automation Engineer with expertise in E2E testing and test frameworks like Cypress or Playwright | Task: Implement comprehensive end-to-end tests covering all requirements, setting up testing framework and user journey tests using test utilities and fixtures | Restrictions: Must test real user workflows, ensure tests are maintainable and reliable, do not test implementation details | Success: E2E tests cover all critical user journeys, tests run reliably in CI/CD pipeline, user experience is validated from end-to-end_
+
+- [ ] 6.2 Final integration and cleanup
+  - Integrate all components
+  - Fix any integration issues
+  - Clean up code and documentation
+  - _Leverage: src/utils/cleanup.ts, docs/templates/_
+  - _Requirements: All_
+  - _Prompt: Role: Senior Developer with expertise in code quality and system integration | Task: Complete final integration of all components and perform comprehensive cleanup covering all requirements, using cleanup utilities and documentation templates | Restrictions: Must not break existing functionality, ensure code quality standards are met, maintain documentation consistency | Success: All components are fully integrated and working together, code is clean and well-documented, system meets all requirements and quality standards_

+ 99 - 0
.spec-workflow/templates/tech-template.md

@@ -0,0 +1,99 @@
+# Technology Stack
+
+## Project Type
+[Describe what kind of project this is: web application, CLI tool, desktop application, mobile app, library, API service, embedded system, game, etc.]
+
+## Core Technologies
+
+### Primary Language(s)
+- **Language**: [e.g., Python 3.11, Go 1.21, TypeScript, Rust, C++]
+- **Runtime/Compiler**: [if applicable]
+- **Language-specific tools**: [package managers, build tools, etc.]
+
+### Key Dependencies/Libraries
+[List the main libraries and frameworks your project depends on]
+- **[Library/Framework name]**: [Purpose and version]
+- **[Library/Framework name]**: [Purpose and version]
+
+### Application Architecture
+[Describe how your application is structured - this could be MVC, event-driven, plugin-based, client-server, standalone, microservices, monolithic, etc.]
+
+### Data Storage (if applicable)
+- **Primary storage**: [e.g., PostgreSQL, files, in-memory, cloud storage]
+- **Caching**: [e.g., Redis, in-memory, disk cache]
+- **Data formats**: [e.g., JSON, Protocol Buffers, XML, binary]
+
+### External Integrations (if applicable)
+- **APIs**: [External services you integrate with]
+- **Protocols**: [e.g., HTTP/REST, gRPC, WebSocket, TCP/IP]
+- **Authentication**: [e.g., OAuth, API keys, certificates]
+
+### Monitoring & Dashboard Technologies (if applicable)
+- **Dashboard Framework**: [e.g., React, Vue, vanilla JS, terminal UI]
+- **Real-time Communication**: [e.g., WebSocket, Server-Sent Events, polling]
+- **Visualization Libraries**: [e.g., Chart.js, D3, terminal graphs]
+- **State Management**: [e.g., Redux, Vuex, file system as source of truth]
+
+## Development Environment
+
+### Build & Development Tools
+- **Build System**: [e.g., Make, CMake, Gradle, npm scripts, cargo]
+- **Package Management**: [e.g., pip, npm, cargo, go mod, apt, brew]
+- **Development workflow**: [e.g., hot reload, watch mode, REPL]
+
+### Code Quality Tools
+- **Static Analysis**: [Tools for code quality and correctness]
+- **Formatting**: [Code style enforcement tools]
+- **Testing Framework**: [Unit, integration, and/or end-to-end testing tools]
+- **Documentation**: [Documentation generation tools]
+
+### Version Control & Collaboration
+- **VCS**: [e.g., Git, Mercurial, SVN]
+- **Branching Strategy**: [e.g., Git Flow, GitHub Flow, trunk-based]
+- **Code Review Process**: [How code reviews are conducted]
+
+### Dashboard Development (if applicable)
+- **Live Reload**: [e.g., Hot module replacement, file watchers]
+- **Port Management**: [e.g., Dynamic allocation, configurable ports]
+- **Multi-Instance Support**: [e.g., Running multiple dashboards simultaneously]
+
+## Deployment & Distribution (if applicable)
+- **Target Platform(s)**: [Where/how the project runs: cloud, on-premise, desktop, mobile, embedded]
+- **Distribution Method**: [How users get your software: download, package manager, app store, SaaS]
+- **Installation Requirements**: [Prerequisites, system requirements]
+- **Update Mechanism**: [How updates are delivered]
+
+## Technical Requirements & Constraints
+
+### Performance Requirements
+- [e.g., response time, throughput, memory usage, startup time]
+- [Specific benchmarks or targets]
+
+### Compatibility Requirements  
+- **Platform Support**: [Operating systems, architectures, versions]
+- **Dependency Versions**: [Minimum/maximum versions of dependencies]
+- **Standards Compliance**: [Industry standards, protocols, specifications]
+
+### Security & Compliance
+- **Security Requirements**: [Authentication, encryption, data protection]
+- **Compliance Standards**: [GDPR, HIPAA, SOC2, etc. if applicable]
+- **Threat Model**: [Key security considerations]
+
+### Scalability & Reliability
+- **Expected Load**: [Users, requests, data volume]
+- **Availability Requirements**: [Uptime targets, disaster recovery]
+- **Growth Projections**: [How the system needs to scale]
+
+## Technical Decisions & Rationale
+[Document key architectural and technology choices]
+
+### Decision Log
+1. **[Technology/Pattern Choice]**: [Why this was chosen, alternatives considered]
+2. **[Architecture Decision]**: [Rationale, trade-offs accepted]
+3. **[Tool/Library Selection]**: [Reasoning, evaluation criteria]
+
+## Known Limitations
+[Document any technical debt, limitations, or areas for improvement]
+
+- [Limitation 1]: [Impact and potential future solutions]
+- [Limitation 2]: [Why it exists and when it might be addressed]

+ 64 - 0
.spec-workflow/user-templates/README.md

@@ -0,0 +1,64 @@
+# User Templates
+
+This directory allows you to create custom templates that override the default Spec Workflow templates.
+
+## How to Use Custom Templates
+
+1. **Create your custom template file** in this directory with the exact same name as the default template you want to override:
+   - `requirements-template.md` - Override requirements document template
+   - `design-template.md` - Override design document template
+   - `tasks-template.md` - Override tasks document template
+   - `product-template.md` - Override product steering template
+   - `tech-template.md` - Override tech steering template
+   - `structure-template.md` - Override structure steering template
+
+2. **Template Loading Priority**:
+   - The system first checks this `user-templates/` directory
+   - If a matching template is found here, it will be used
+   - Otherwise, the default template from `templates/` will be used
+
+## Example Custom Template
+
+To create a custom requirements template:
+
+1. Create a file named `requirements-template.md` in this directory
+2. Add your custom structure, for example:
+
+```markdown
+# Requirements Document
+
+## Executive Summary
+[Your custom section]
+
+## Business Requirements
+[Your custom structure]
+
+## Technical Requirements
+[Your custom fields]
+
+## Custom Sections
+[Add any sections specific to your workflow]
+```
+
+## Template Variables
+
+Templates can include placeholders that will be replaced when documents are created:
+- `{{projectName}}` - The name of your project
+- `{{featureName}}` - The name of the feature being specified
+- `{{date}}` - The current date
+- `{{author}}` - The document author
+
+## Best Practices
+
+1. **Start from defaults**: Copy a default template from `../templates/` as a starting point
+2. **Keep structure consistent**: Maintain similar section headers for tool compatibility
+3. **Document changes**: Add comments explaining why sections were added/modified
+4. **Version control**: Track your custom templates in version control
+5. **Test thoroughly**: Ensure custom templates work with the spec workflow tools
+
+## Notes
+
+- Custom templates are project-specific and not included in the package distribution
+- The `templates/` directory contains the default templates which are updated with each version
+- Your custom templates in this directory are preserved during updates
+- If a custom template has errors, the system will fall back to the default template
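
The loading priority and placeholder variables described in this README could look roughly like the following sketch; the function, paths, and substitution logic are illustrative assumptions, not the actual spec-workflow implementation:

```python
from pathlib import Path

def load_template(name: str, project_root: Path, variables: dict) -> str:
    """Prefer user-templates/<name>, fall back to templates/<name>, then fill {{placeholders}}."""
    for folder in ("user-templates", "templates"):
        candidate = project_root / ".spec-workflow" / folder / name
        if candidate.exists():
            text = candidate.read_text(encoding="utf-8")
            for key, value in variables.items():
                text = text.replace("{{" + key + "}}", value)
            return text
    raise FileNotFoundError(name)

# e.g. load_template("requirements-template.md", Path("."),
#                    {"projectName": "demo", "featureName": "login",
#                     "date": "2025-01-01", "author": "me"})
```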

+ 129 - 0
AGENTS.md

@@ -0,0 +1,129 @@
+# AGENTS
+
+<skills_system priority="1">
+
+## Available Skills
+
+<!-- SKILLS_TABLE_START -->
+<usage>
+When users ask you to perform tasks, check if any of the available skills below can help complete the task more effectively. Skills provide specialized capabilities and domain knowledge.
+
+How to use skills:
+- Invoke: Bash("openskills read <skill-name>")
+- The skill content will load with detailed instructions on how to complete the task
+- Base directory provided in output for resolving bundled resources (references/, scripts/, assets/)
+
+Usage notes:
+- Only use skills listed in <available_skills> below
+- Do not invoke a skill that is already loaded in your context
+- Each skill invocation is stateless
+</usage>
+
+<available_skills>
+
+<skill>
+<name>algorithmic-art</name>
+<description>Creating algorithmic art using p5.js with seeded randomness and interactive parameter exploration. Use this when users request creating art using code, generative art, algorithmic art, flow fields, or particle systems. Create original algorithmic art rather than copying existing artists' work to avoid copyright violations.</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>brand-guidelines</name>
+<description>Applies Anthropic's official brand colors and typography to any sort of artifact that may benefit from having Anthropic's look-and-feel. Use it when brand colors or style guidelines, visual formatting, or company design standards apply.</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>canvas-design</name>
+<description>Create beautiful visual art in .png and .pdf documents using design philosophy. You should use this skill when the user asks to create a poster, piece of art, design, or other static piece. Create original visual designs, never copying existing artists' work to avoid copyright violations.</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>doc-coauthoring</name>
+<description>Guide users through a structured workflow for co-authoring documentation. Use when user wants to write documentation, proposals, technical specs, decision docs, or similar structured content. This workflow helps users efficiently transfer context, refine content through iteration, and verify the doc works for readers. Trigger when user mentions writing docs, creating proposals, drafting specs, or similar documentation tasks.</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>docx</name>
+<description>"Comprehensive document creation, editing, and analysis with support for tracked changes, comments, formatting preservation, and text extraction. When Claude needs to work with professional documents (.docx files) for: (1) Creating new documents, (2) Modifying or editing content, (3) Working with tracked changes, (4) Adding comments, or any other document tasks"</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>frontend-design</name>
+<description>Create distinctive, production-grade frontend interfaces with high design quality. Use this skill when the user asks to build web components, pages, artifacts, posters, or applications (examples include websites, landing pages, dashboards, React components, HTML/CSS layouts, or when styling/beautifying any web UI). Generates creative, polished code and UI design that avoids generic AI aesthetics.</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>internal-comms</name>
+<description>A set of resources to help me write all kinds of internal communications, using the formats that my company likes to use. Claude should use this skill whenever asked to write some sort of internal communications (status reports, leadership updates, 3P updates, company newsletters, FAQs, incident reports, project updates, etc.).</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>mcp-builder</name>
+<description>Guide for creating high-quality MCP (Model Context Protocol) servers that enable LLMs to interact with external services through well-designed tools. Use when building MCP servers to integrate external APIs or services, whether in Python (FastMCP) or Node/TypeScript (MCP SDK).</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>pdf</name>
+<description>Comprehensive PDF manipulation toolkit for extracting text and tables, creating new PDFs, merging/splitting documents, and handling forms. When Claude needs to fill in a PDF form or programmatically process, generate, or analyze PDF documents at scale.</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>pptx</name>
+<description>"Presentation creation, editing, and analysis. When Claude needs to work with presentations (.pptx files) for: (1) Creating new presentations, (2) Modifying or editing content, (3) Working with layouts, (4) Adding comments or speaker notes, or any other presentation tasks"</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>skill-creator</name>
+<description>Guide for creating effective skills. This skill should be used when users want to create a new skill (or update an existing skill) that extends Claude's capabilities with specialized knowledge, workflows, or tool integrations.</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>slack-gif-creator</name>
+<description>Knowledge and utilities for creating animated GIFs optimized for Slack. Provides constraints, validation tools, and animation concepts. Use when users request animated GIFs for Slack like "make me a GIF of X doing Y for Slack."</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>template</name>
+<description>Replace with description of the skill and when Claude should use it.</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>theme-factory</name>
+<description>Toolkit for styling artifacts with a theme. These artifacts can be slides, docs, reports, HTML landing pages, etc. There are 10 pre-set themes with colors/fonts that you can apply to any artifact that has been created, or can generate a new theme on-the-fly.</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>web-artifacts-builder</name>
+<description>Suite of tools for creating elaborate, multi-component claude.ai HTML artifacts using modern frontend web technologies (React, Tailwind CSS, shadcn/ui). Use for complex artifacts requiring state management, routing, or shadcn/ui components - not for simple single-file HTML/JSX artifacts.</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>webapp-testing</name>
+<description>Toolkit for interacting with and testing local web applications using Playwright. Supports verifying frontend functionality, debugging UI behavior, capturing browser screenshots, and viewing browser logs.</description>
+<location>global</location>
+</skill>
+
+<skill>
+<name>xlsx</name>
+<description>"Comprehensive spreadsheet creation, editing, and analysis with support for formulas, formatting, data analysis, and visualization. When Claude needs to work with spreadsheets (.xlsx, .xlsm, .csv, .tsv, etc) for: (1) Creating new spreadsheets with formulas and formatting, (2) Reading or analyzing data, (3) Modify existing spreadsheets while preserving formulas, (4) Data analysis and visualization in spreadsheets, or (5) Recalculating formulas"</description>
+<location>global</location>
+</skill>
+
+</available_skills>
+<!-- SKILLS_TABLE_END -->
+
+</skills_system>
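
A minimal sketch of the invocation pattern the `<usage>` block above describes — an agent shelling out to the openskills CLI. Only the command name and the `read` subcommand come from the document; the Python wrapper itself is an assumption:

```python
import subprocess

def read_skill(skill_name: str) -> str:
    """Load a skill's instructions via the openskills CLI, as the usage block suggests."""
    result = subprocess.run(
        ["openskills", "read", skill_name],
        capture_output=True, text=True, check=True,
    )
    # The output contains the skill content plus its base directory for bundled resources.
    return result.stdout

# e.g. instructions = read_skill("webapp-testing")
```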

File diff suppressed because it is too large
+ 445 - 206
Lib/future/MAPatternStrategy_v001.py


+ 0 - 1127
Lib/future/MAPatternStrategy_v001.py.bak

@@ -1,1127 +0,0 @@
-# 导入函数库
-from jqdata import *
-from jqdata import finance
-import pandas as pd
-import numpy as np
-from datetime import date, datetime, timedelta, time
-import re
-
-# 顺势交易策略 v001
-# 基于均线走势(前提条件)+ K线形态(开盘价差、当天价差)的期货交易策略
-#
-# 核心逻辑:
-# 1. 开盘时检查均线走势(MA30<=MA20<=MA10<=MA5为多头,反之为空头)
-# 2. 检查开盘价差是否符合方向要求(多头>=0.5%,空头<=-0.5%)
-# 3. 14:35和14:55检查当天价差(多头>0,空头<0),满足条件则开仓
-# 4. 应用固定止损和动态追踪止盈
-# 5. 自动换月移仓
-
-# 设置以便完整打印 DataFrame
-pd.set_option('display.max_rows', None)
-pd.set_option('display.max_columns', None)
-pd.set_option('display.width', None)
-pd.set_option('display.max_colwidth', 20)
-
-## 初始化函数,设定基准等等
-def initialize(context):
-    # 设定沪深300作为基准
-    set_benchmark('000300.XSHG')
-    # 开启动态复权模式(真实价格)
-    set_option('use_real_price', True)
-    # 输出内容到日志
-    log.info('=' * 60)
-    log.info('均线形态交易策略 v001 初始化开始')
-    log.info('策略类型: 均线走势 + K线形态')
-    log.info('=' * 60)
-
-    ### 期货相关设定 ###
-    # 设定账户为金融账户
-    set_subportfolios([SubPortfolioConfig(cash=context.portfolio.starting_cash, type='index_futures')])
-    # 期货类每笔交易时的手续费是: 买入时万分之0.23,卖出时万分之0.23,平今仓为万分之23
-    set_order_cost(OrderCost(open_commission=0.000023, close_commission=0.000023, close_today_commission=0.0023), type='index_futures')
-    
-    # 设置期货交易的滑点
-    set_slippage(StepRelatedSlippage(2))
-    
-    # 初始化全局变量
-    g.usage_percentage = 0.8  # 最大资金使用比例
-    g.max_margin_per_position = 20000  # 单个标的最大持仓保证金(元)
-    
-    # 均线策略参数
-    g.ma_periods = [5, 10, 20, 30]  # 均线周期
-    g.ma_historical_days = 60  # 获取历史数据天数(确保足够计算MA30)
-    g.ma_open_gap_threshold = 0.002  # 方案1开盘价差阈值(0.2%)
-    g.ma_pattern_lookback_days = 10  # 历史均线模式一致性检查的天数
-    g.ma_pattern_consistency_threshold = 0.8  # 历史均线模式一致性阈值(80%)
-    g.check_intraday_spread = False  # 是否检查日内价差(True: 检查, False: 跳过)
-    g.ma_proximity_min_threshold = 8  # MA5与MA10贴近计数和的最低阈值
-    
-    # 均线价差策略方案选择
-    g.ma_gap_strategy_mode = 2  # 策略模式选择(1: 原方案, 2: 新方案)
-    g.ma_open_gap_threshold2 = 0.002  # 方案2开盘价差阈值(0.2%)
-    g.ma_intraday_threshold_scheme2 = 0.005  # 方案2日内变化阈值(0.5%)
-    
-    # 止损止盈策略参数
-    g.fixed_stop_loss_rate = 0.01  # 固定止损比率(1%)
-    g.ma_offset_ratio_normal = 0.003  # 均线跟踪止盈常规偏移量(0.3%)
-    g.ma_offset_ratio_close = 0.01  # 均线跟踪止盈收盘前偏移量(1%)
-    g.days_for_adjustment = 4  # 持仓天数调整阈值
-    
-    # 输出策略参数
-    log.info("均线形态策略参数:")
-    log.info(f"  均线周期: {g.ma_periods}")
-    log.info(f"  策略模式: 方案{g.ma_gap_strategy_mode}")
-    log.info(f"  方案1开盘价差阈值: {g.ma_open_gap_threshold:.1%}")
-    log.info(f"  方案2开盘价差阈值: {g.ma_open_gap_threshold2:.1%}")
-    log.info(f"  方案2日内变化阈值: {g.ma_intraday_threshold_scheme2:.1%}")
-    log.info(f"  历史均线模式检查天数: {g.ma_pattern_lookback_days}天")
-    log.info(f"  历史均线模式一致性阈值: {g.ma_pattern_consistency_threshold:.1%}")
-    log.info(f"  均线贴近计数阈值: {g.ma_proximity_min_threshold}")
-    log.info(f"  是否检查日内价差: {g.check_intraday_spread}")
-    log.info(f"  固定止损: {g.fixed_stop_loss_rate:.1%}")
-    log.info(f"  均线跟踪止盈常规偏移: {g.ma_offset_ratio_normal:.1%}")
-    log.info(f"  均线跟踪止盈收盘前偏移: {g.ma_offset_ratio_close:.1%}")
-    log.info(f"  持仓天数调整阈值: {g.days_for_adjustment}天")
-    
-    # 期货品种完整配置字典
-    g.futures_config = {
-        # 贵金属
-        'AU': {'has_night_session': True, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 1000, 'trading_start_time': '21:00'},
-        'AG': {'has_night_session': True, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 15, 'trading_start_time': '21:00'},
-        
-        # 有色金属
-        'CU': {'has_night_session': True, 'margin_rate': {'long': 0.09, 'short': 0.09}, 'multiplier': 5, 'trading_start_time': '21:00'},
-        'AL': {'has_night_session': True, 'margin_rate': {'long': 0.09, 'short': 0.09}, 'multiplier': 5, 'trading_start_time': '21:00'},
-        'ZN': {'has_night_session': True, 'margin_rate': {'long': 0.09, 'short': 0.09}, 'multiplier': 5, 'trading_start_time': '21:00'},
-        'PB': {'has_night_session': True, 'margin_rate': {'long': 0.09, 'short': 0.09}, 'multiplier': 5, 'trading_start_time': '21:00'},
-        'NI': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 1, 'trading_start_time': '21:00'},
-        'SN': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 1, 'trading_start_time': '21:00'},
-        'SS': {'has_night_session': True, 'margin_rate': {'long': 0.07, 'short': 0.07}, 'multiplier': 5, 'trading_start_time': '21:00'},
-        
-        # 黑色系
-        'RB': {'has_night_session': True, 'margin_rate': {'long': 0.07, 'short': 0.07}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'HC': {'has_night_session': True, 'margin_rate': {'long': 0.07, 'short': 0.07}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'I': {'has_night_session': True, 'margin_rate': {'long': 0.1, 'short': 0.1}, 'multiplier': 100, 'trading_start_time': '21:00'},
-        'JM': {'has_night_session': True, 'margin_rate': {'long': 0.22, 'short': 0.22}, 'multiplier': 100, 'trading_start_time': '21:00'},
-        'J': {'has_night_session': True, 'margin_rate': {'long': 0.22, 'short': 0.22}, 'multiplier': 60, 'trading_start_time': '21:00'},
-        
-        # 能源化工
-        'SP': {'has_night_session': True, 'margin_rate': {'long': 0.1, 'short': 0.1}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'FU': {'has_night_session': True, 'margin_rate': {'long': 0.08, 'short': 0.08}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'BU': {'has_night_session': True, 'margin_rate': {'long': 0.04, 'short': 0.04}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'RU': {'has_night_session': True, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'BR': {'has_night_session': True, 'margin_rate': {'long': 0.07, 'short': 0.07}, 'multiplier': 5, 'trading_start_time': '21:00'},
-        'SC': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 1000, 'trading_start_time': '21:00'},
-        'NR': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'LU': {'has_night_session': True, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'LC': {'has_night_session': False, 'margin_rate': {'long': 0.1, 'short': 0.1}, 'multiplier': 1, 'trading_start_time': '09:00'},
-        
-        # 化工
-        'FG': {'has_night_session': True, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 20, 'trading_start_time': '21:00'},
-        'TA': {'has_night_session': True, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 5, 'trading_start_time': '21:00'},
-        'MA': {'has_night_session': True, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'SA': {'has_night_session': True, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 20, 'trading_start_time': '21:00'},
-        'L': {'has_night_session': True, 'margin_rate': {'long': 0.07, 'short': 0.07}, 'multiplier': 5, 'trading_start_time': '21:00'},
-        'V': {'has_night_session': True, 'margin_rate': {'long': 0.07, 'short': 0.07}, 'multiplier': 5, 'trading_start_time': '21:00'},
-        'EG': {'has_night_session': True, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'PP': {'has_night_session': True, 'margin_rate': {'long': 0.07, 'short': 0.07}, 'multiplier': 5, 'trading_start_time': '21:00'},
-        'EB': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 5, 'trading_start_time': '21:00'},
-        'PG': {'has_night_session': True, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 20, 'trading_start_time': '21:00'},
-        
-        # 农产品
-        'RM': {'has_night_session': True, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'OI': {'has_night_session': True, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'CF': {'has_night_session': True, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 5, 'trading_start_time': '21:00'},
-        'SR': {'has_night_session': True, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'PF': {'has_night_session': True, 'margin_rate': {'long': 0.1, 'short': 0.1}, 'multiplier': 5, 'trading_start_time': '21:00'},
-        'C': {'has_night_session': True, 'margin_rate': {'long': 0.07, 'short': 0.07}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'CS': {'has_night_session': True, 'margin_rate': {'long': 0.07, 'short': 0.07}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'CY': {'has_night_session': True, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 5, 'trading_start_time': '21:00'},
-        'A': {'has_night_session': True, 'margin_rate': {'long': 0.07, 'short': 0.07}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'B': {'has_night_session': True, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'M': {'has_night_session': True, 'margin_rate': {'long': 0.07, 'short': 0.07}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'Y': {'has_night_session': True, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        'P': {'has_night_session': True, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 10, 'trading_start_time': '21:00'},
-        
-        # 无夜盘品种
-        'IF': {'has_night_session': False, 'margin_rate': {'long': 0.08, 'short': 0.08}, 'multiplier': 300, 'trading_start_time': '09:30'},
-        'IH': {'has_night_session': False, 'margin_rate': {'long': 0.08, 'short': 0.08}, 'multiplier': 300, 'trading_start_time': '09:30'},
-        'IC': {'has_night_session': False, 'margin_rate': {'long': 0.08, 'short': 0.08}, 'multiplier': 200, 'trading_start_time': '09:30'},
-        'IM': {'has_night_session': False, 'margin_rate': {'long': 0.08, 'short': 0.08}, 'multiplier': 200, 'trading_start_time': '09:30'},
-        'AP': {'has_night_session': False, 'margin_rate': {'long': 0.08, 'short': 0.08}, 'multiplier': 10, 'trading_start_time': '09:00'},
-        'CJ': {'has_night_session': False, 'margin_rate': {'long': 0.09, 'short': 0.09}, 'multiplier': 5, 'trading_start_time': '09:00'},
-        'PK': {'has_night_session': False, 'margin_rate': {'long': 0.05, 'short': 0.05}, 'multiplier': 5, 'trading_start_time': '09:00'},
-        'JD': {'has_night_session': False, 'margin_rate': {'long': 0.07, 'short': 0.07}, 'multiplier': 10, 'trading_start_time': '09:00'},
-        'LH': {'has_night_session': False, 'margin_rate': {'long': 0.1, 'short': 0.1}, 'multiplier': 16, 'trading_start_time': '09:00'}
-    }
-    
-    # 策略品种选择策略配置
-    # 方案1:全品种策略 - 考虑所有配置的期货品种
-    g.strategy_focus_symbols = ['IC', 'LH']  # 空列表表示考虑所有品种
-    
-    # 方案2:精选品种策略 - 只交易流动性较好的特定品种(如需使用请取消下行注释)
-    # g.strategy_focus_symbols = ['RM', 'CJ', 'CY', 'JD', 'L', 'LC', 'SF', 'SI']
-    
-    log.info(f"品种选择策略: {'全品种策略(覆盖所有配置品种)' if not g.strategy_focus_symbols else '精选品种策略(' + str(len(g.strategy_focus_symbols)) + '个品种)'}")
-    
-    # 交易记录和数据存储
-    g.trade_history = {}  # 持仓记录 {symbol: {'entry_price': xxx, 'direction': xxx, ...}}
-    g.daily_ma_candidates = {}  # 通过均线和开盘价差检查的候选品种 {symbol: {'direction': 'long'/'short', 'open_price': xxx, ...}}
-    g.today_trades = []  # 当日交易记录
-    g.excluded_contracts = {}  # 每日排除的合约缓存 {dominant_future: {'reason': 'ma_trend'/'open_gap', 'trading_day': xxx}}
-    g.ma_checked_underlyings = {}  # 记录各品种在交易日的均线检查状态 {symbol: trading_day}
-    g.last_ma_trading_day = None  # 最近一次均线检查所属交易日
-    
-    # 定时任务设置
-    # 夜盘开始(21:05) - 均线和开盘价差检查
-    run_daily(check_ma_trend_and_open_gap, time='21:05:00', reference_security='IF1808.CCFX')
-    
-    # 日盘开始 - 均线和开盘价差检查
-    run_daily(check_ma_trend_and_open_gap, time='09:05:00', reference_security='IF1808.CCFX')
-    run_daily(check_ma_trend_and_open_gap, time='09:35:00', reference_security='IF1808.CCFX')
-    
-    # 盘中价差检查和开仓(14:35和14:55)
-    run_daily(check_intraday_price_diff, time='14:35:00', reference_security='IF1808.CCFX')
-    run_daily(check_intraday_price_diff, time='14:55:00', reference_security='IF1808.CCFX')
-    
-    # 夜盘止损止盈检查
-    run_daily(check_stop_loss_profit, time='21:05:00', reference_security='IF1808.CCFX')
-    run_daily(check_stop_loss_profit, time='21:35:00', reference_security='IF1808.CCFX')
-    run_daily(check_stop_loss_profit, time='22:05:00', reference_security='IF1808.CCFX')
-    run_daily(check_stop_loss_profit, time='22:35:00', reference_security='IF1808.CCFX')
-    
-    # 日盘止损止盈检查
-    run_daily(check_stop_loss_profit, time='09:05:00', reference_security='IF1808.CCFX')
-    run_daily(check_stop_loss_profit, time='09:35:00', reference_security='IF1808.CCFX')
-    run_daily(check_stop_loss_profit, time='10:05:00', reference_security='IF1808.CCFX')
-    run_daily(check_stop_loss_profit, time='10:35:00', reference_security='IF1808.CCFX')
-    run_daily(check_stop_loss_profit, time='11:05:00', reference_security='IF1808.CCFX')
-    run_daily(check_stop_loss_profit, time='11:25:00', reference_security='IF1808.CCFX')
-    run_daily(check_stop_loss_profit, time='13:35:00', reference_security='IF1808.CCFX')
-    run_daily(check_stop_loss_profit, time='14:05:00', reference_security='IF1808.CCFX')
-    run_daily(check_stop_loss_profit, time='14:35:00', reference_security='IF1808.CCFX')
-    run_daily(check_stop_loss_profit, time='14:55:00', reference_security='IF1808.CCFX')
-    
-    # 收盘后
-    run_daily(after_market_close, time='15:30:00', reference_security='IF1808.CCFX')
-    
-    log.info('=' * 60)
-
-############################ 主程序执行函数 ###################################
-
-def get_current_trading_day(current_dt):
-    """根据当前时间推断对应的期货交易日"""
-    current_date = current_dt.date()
-    current_time = current_dt.time()
-
-    trade_days = get_trade_days(end_date=current_date, count=1)
-    if trade_days and trade_days[0] == current_date:
-        trading_day = current_date
-    else:
-        next_days = get_trade_days(start_date=current_date, count=1)
-        trading_day = next_days[0] if next_days else current_date
-
-    if current_time >= time(20, 59):
-        next_trade_days = get_trade_days(start_date=trading_day, count=2)
-        if len(next_trade_days) >= 2:
-            return next_trade_days[1]
-        if len(next_trade_days) == 1:
-            return next_trade_days[0]
-    return trading_day
-
-
-def normalize_trade_day_value(value):
-    """将交易日对象统一转换为 datetime.date"""
-    if isinstance(value, date) and not isinstance(value, datetime):
-        return value
-    if isinstance(value, datetime):
-        return value.date()
-    if hasattr(value, 'to_pydatetime'):
-        return value.to_pydatetime().date()
-    try:
-        return pd.Timestamp(value).date()
-    except Exception:
-        return value
-
-
-def check_ma_trend_and_open_gap(context):
-    """阶段一:开盘时均线走势和开盘价差检查(一天一次)"""
-    log.info("=" * 60)
-    current_trading_day = get_current_trading_day(context.current_dt)
-    log.info(f"执行均线走势和开盘价差检查 - 时间: {context.current_dt}, 交易日: {current_trading_day}")
-    log.info("=" * 60)
-    
-    # 先检查换月移仓
-    position_auto_switch(context)
-    
-    # 检查是否进入新交易日,必要时清空缓存
-    if g.last_ma_trading_day != current_trading_day:
-        if g.excluded_contracts:
-            log.info(f"交易日切换至 {current_trading_day},清空上一交易日的排除缓存")
-        g.excluded_contracts = {}
-        g.ma_checked_underlyings = {}
-        g.last_ma_trading_day = current_trading_day
-
-    # 获取当前时间
-    current_time = str(context.current_dt.time())[:5]  # HH:MM格式
-    
-    # 筛选可交易品种(根据交易开始时间判断)
-    focus_symbols = g.strategy_focus_symbols if g.strategy_focus_symbols else list(g.futures_config.keys())
-    tradable_symbols = []
-    
-    # 根据当前时间确定可交易的时段
-    # 21:05 -> 仅接受21:00开盘的合约
-    # 09:05 -> 接受09:00或21:00开盘的合约
-    # 09:35 -> 接受所有时段(21:00, 09:00, 09:30)的合约
-    for symbol in focus_symbols:
-        trading_start_time = get_futures_config(symbol, 'trading_start_time', '09:05')
-        should_trade = False
-        
-        if current_time == '21:05':
-            # 夜盘开盘:仅接受21:00开盘的品种
-            should_trade = trading_start_time.startswith('21:00')
-        elif current_time == '09:05':
-            # 日盘早盘:接受21:00和09:00开盘的品种
-            should_trade = trading_start_time.startswith('21:00') or trading_start_time.startswith('09:00')
-        elif current_time == '09:35':
-            # 日盘晚开:接受所有品种(21:00, 09:00, 09:30)
-            should_trade = True
-        
-        if should_trade:
-            tradable_symbols.append(symbol)
-    
-    if not tradable_symbols:
-        log.info(f"当前时间 {current_time} 无品种开盘,跳过检查")
-        return
-    
-    log.info(f"当前时间 {current_time} 开盘品种: {tradable_symbols}")
-    
-    # 对每个品种执行均线和开盘价差检查
-    for symbol in tradable_symbols:
-        if g.ma_checked_underlyings.get(symbol) == current_trading_day:
-            log.info(f"{symbol} 已在交易日 {current_trading_day} 完成均线检查,跳过本次执行")
-            continue
-
-        try:
-            g.ma_checked_underlyings[symbol] = current_trading_day
-            # 获取主力合约
-            dominant_future = get_dominant_future(symbol)
-            # log.debug(f"{symbol} 主力合约: {dominant_future}")
-            if not dominant_future:
-                log.info(f"{symbol} 未找到主力合约,跳过")
-                continue
-            
-            # 检查是否在排除缓存中(当日已检查过但不符合条件)
-            if dominant_future in g.excluded_contracts:
-                excluded_info = g.excluded_contracts[dominant_future]
-                if excluded_info['trading_day'] == current_trading_day:
-                    # log.debug(f"{symbol} 在排除缓存中(原因: {excluded_info['reason']}),跳过")
-                    continue
-                else:
-                    # 新的一天,从缓存中移除(会在after_market_close统一清理,这里也做兜底)
-                    del g.excluded_contracts[dominant_future]
-            
-            # 检查是否已有持仓
-            if check_symbol_prefix_match(dominant_future, set(g.trade_history.keys())):
-                log.info(f"{symbol} 已有持仓,跳过")
-                continue
-            
-            # 获取历史数据(需要足够计算MA30)
-            # 使用get_price获取数据,可以正确处理夜盘品种
-            # 注意:historical_data最后一行是昨天的数据,不包含今天的数据
-            historical_data = get_price(dominant_future, end_date=context.current_dt, 
-                                       frequency='1d', fields=['open', 'close', 'high', 'low'], 
-                                       count=g.ma_historical_days)
-            
-            if historical_data is None or len(historical_data) < max(g.ma_periods):
-                log.info(f"{symbol} 历史数据不足,跳过")
-                continue
-
-            previous_trade_days = get_trade_days(end_date=current_trading_day, count=2)
-            previous_trade_days = [normalize_trade_day_value(d) for d in previous_trade_days]
-            previous_trading_day = None
-            if len(previous_trade_days) >= 2:
-                previous_trading_day = previous_trade_days[-2]
-            elif len(previous_trade_days) == 1 and previous_trade_days[0] < current_trading_day:
-                previous_trading_day = previous_trade_days[0]
-
-            if previous_trading_day is None:
-                log.info(f"{symbol} 无法确定前一交易日,跳过")
-                continue
-
-            historical_dates = historical_data.index.date
-            match_indices = np.where(historical_dates == previous_trading_day)[0]
-
-            if len(match_indices) == 0:
-                earlier_indices = np.where(historical_dates < previous_trading_day)[0]
-                if len(earlier_indices) == 0:
-                    log.info(f"{symbol} 历史数据缺少 {previous_trading_day} 之前的记录,跳过")
-                    continue
-                match_indices = [earlier_indices[-1]]
-
-            data_upto_yesterday = historical_data.iloc[:match_indices[-1] + 1]
-            # log.debug(f"data_upto_yesterday: {data_upto_yesterday}")
-            yesterday_data = data_upto_yesterday.iloc[-1]
-            yesterday_close = yesterday_data['close']
-            
-            # 获取今天的开盘价(使用get_current_data API)
-            current_data = get_current_data()[dominant_future]
-            today_open = current_data.day_open
-            
-            # log.info(f"  历史数据时间范围: {historical_data.index[0]} 至 {historical_data.index[-1]}")
-            
-            # 计算昨天的均线值(使用截至前一交易日的数据)
-            ma_values = calculate_ma_values(data_upto_yesterday, g.ma_periods)
-            ma_proximity_counts = calculate_ma_proximity_counts(data_upto_yesterday, g.ma_periods, g.ma_pattern_lookback_days)
-            
-            log.info(f"{symbol}({dominant_future}) 均线检查:")
-            # log.debug(f"yesterday_data: {yesterday_data}")
-            # log.info(f"  昨收: {yesterday_close:.2f}, 今开: {today_open:.2f}")
-            # log.info(f"  昨日均线 - MA5: {ma_values['MA5']:.2f}, MA10: {ma_values['MA10']:.2f}, "
-            #         f"MA20: {ma_values['MA20']:.2f}, MA30: {ma_values['MA30']:.2f}")
-            log.info(f"  均线贴近统计: {ma_proximity_counts}")
-            proximity_sum = ma_proximity_counts.get('MA5', 0) + ma_proximity_counts.get('MA10', 0)
-            if proximity_sum < g.ma_proximity_min_threshold:
-                log.info(f"  {symbol}({dominant_future}) ✗ 均线贴近计数不足,MA5+MA10={proximity_sum} < {g.ma_proximity_min_threshold},跳过")
-                g.excluded_contracts[dominant_future] = {
-                    'reason': 'ma_proximity',
-                    'trading_day': current_trading_day
-                }
-                continue
-            
-            # 判断均线走势(使用新的灵活模式检查)
-            direction = None
-            if check_ma_pattern(ma_values, 'long'):
-                direction = 'long'
-                # log.info(f"  {symbol}({dominant_future}) 均线走势判断: 多头排列")
-            elif check_ma_pattern(ma_values, 'short'):
-                direction = 'short'
-                # log.info(f"  {symbol}({dominant_future}) 均线走势判断: 空头排列")
-            else:
-                # log.info(f"  均线走势判断: 不符合多头或空头排列,跳过")
-                # 将不符合条件的合约加入排除缓存
-                g.excluded_contracts[dominant_future] = {
-                    'reason': 'ma_trend',
-                    'trading_day': current_trading_day
-                }
-                continue
-            
-            # 检查历史均线模式一致性
-            consistency_passed, consistency_ratio = check_historical_ma_pattern_consistency(
-                historical_data, direction, g.ma_pattern_lookback_days, g.ma_pattern_consistency_threshold
-            )
-            
-            if not consistency_passed:
-                log.info(f"  {symbol}({dominant_future}) ✗ 历史均线模式一致性不足 "
-                        f"({consistency_ratio:.1%} < {g.ma_pattern_consistency_threshold:.1%}),跳过")
-                g.excluded_contracts[dominant_future] = {
-                    'reason': 'ma_consistency',
-                    'trading_day': current_trading_day
-                }
-                continue
-            else:
-                log.info(f"  {symbol}({dominant_future}) ✓ 历史均线模式一致性检查通过 "
-                        f"({consistency_ratio:.1%} >= {g.ma_pattern_consistency_threshold:.1%})")
-            
-            # 计算开盘价差比例
-            open_gap_ratio = (today_open - yesterday_close) / yesterday_close
-            
-            log.info(f"  开盘价差检查: 昨收 {yesterday_close:.2f}, 今开 {today_open:.2f}, "
-                    f"价差比例 {open_gap_ratio:.2%}")
-            
-            # 检查开盘价差是否符合方向要求
-            gap_check_passed = False
-            
-            if g.ma_gap_strategy_mode == 1:
-                # 方案1:多头检查上跳,空头检查下跳
-                if direction == 'long' and open_gap_ratio >= g.ma_open_gap_threshold:
-                    log.info(f"  {symbol}({dominant_future}) ✓ 方案1多头开盘价差检查通过 ({open_gap_ratio:.2%} >= {g.ma_open_gap_threshold:.2%})")
-                    gap_check_passed = True
-                elif direction == 'short' and open_gap_ratio <= -g.ma_open_gap_threshold:
-                    log.info(f"  {symbol}({dominant_future}) ✓ 方案1空头开盘价差检查通过 ({open_gap_ratio:.2%} <= {-g.ma_open_gap_threshold:.2%})")
-                    gap_check_passed = True
-            elif g.ma_gap_strategy_mode == 2:
-                # 方案2:多头检查下跳,空头检查上跳
-                if direction == 'long' and open_gap_ratio <= -g.ma_open_gap_threshold2:
-                    log.info(f"  {symbol}({dominant_future}) ✓ 方案2多头开盘价差检查通过 ({open_gap_ratio:.2%} <= {-g.ma_open_gap_threshold2:.2%})")
-                    gap_check_passed = True
-                elif direction == 'short' and open_gap_ratio >= g.ma_open_gap_threshold2:
-                    log.info(f"  {symbol}({dominant_future}) ✓ 方案2空头开盘价差检查通过 ({open_gap_ratio:.2%} >= {g.ma_open_gap_threshold2:.2%})")
-                    gap_check_passed = True
-            
-            if not gap_check_passed:
-                # log.info(f"  ✗ 开盘价差不符合方案{g.ma_gap_strategy_mode} {direction}方向要求,跳过")
-                # 将不符合条件的合约加入排除缓存
-                g.excluded_contracts[dominant_future] = {
-                    'reason': 'open_gap',
-                    'trading_day': current_trading_day
-                }
-                continue
-            
-            # 将通过检查的品种加入候选列表
-            g.daily_ma_candidates[dominant_future] = {
-                'symbol': symbol,
-                'direction': direction,
-                'open_price': today_open,
-                'yesterday_close': yesterday_close,
-                'ma_values': ma_values
-            }
-            
-            log.info(f"  ✓✓ {symbol} 通过均线和开盘价差检查,加入候选列表")
-            
-        except Exception as e:
-            g.ma_checked_underlyings.pop(symbol, None)
-            log.warning(f"{symbol} 检查时出错: {str(e)}")
-            continue
-    
-    log.info(f"候选列表更新完成,当前候选品种: {list(g.daily_ma_candidates.keys())}")
-    log.info("=" * 60)
-
-def check_intraday_price_diff(context):
-    """阶段二:盘中价差检查和开仓(14:35和14:55)"""
-    log.info("=" * 60)
-    log.info(f"执行当天价差检查和开仓逻辑 - 时间: {context.current_dt}")
-    log.info("=" * 60)
-    
-    # 先检查换月移仓
-    position_auto_switch(context)
-    
-    if not g.daily_ma_candidates:
-        log.info("当前无候选品种,跳过")
-        return
-    
-    log.info(f"候选品种数量: {len(g.daily_ma_candidates)}")
-    
-    # 遍历候选品种
-    candidates_to_remove = []
-    
-    for dominant_future, candidate_info in g.daily_ma_candidates.items():
-        try:
-            symbol = candidate_info['symbol']
-            direction = candidate_info['direction']
-            open_price = candidate_info['open_price']
-            
-            # 再次检查是否已有持仓
-            if check_symbol_prefix_match(dominant_future, set(g.trade_history.keys())):
-                log.info(f"{symbol} 已有持仓,从候选列表移除")
-                candidates_to_remove.append(dominant_future)
-                continue
-            
-            # 获取当前价格
-            current_data = get_current_data()[dominant_future]
-            current_price = current_data.last_price
-            
-            # 计算当天价差
-            intraday_diff = current_price - open_price
-            intraday_diff_ratio = intraday_diff / open_price  # 计算相对变化比例
-            
-            log.info(f"{symbol}({dominant_future}) 当天价差检查:")
-            log.info(f"  方向: {direction}, 开盘价: {open_price:.2f}, 当前价: {current_price:.2f}, "
-                    f"当天价差: {intraday_diff:.2f}, 变化比例: {intraday_diff_ratio:.2%}")
-            
-            # 判断是否满足开仓条件
-            should_open = False
-            
-            if g.ma_gap_strategy_mode == 1:
-                # 方案1:根据参数决定是否检查日内价差
-                if not g.check_intraday_spread:
-                    # 跳过日内价差检查,直接允许开仓
-                    log.info(f"  方案1跳过日内价差检查(check_intraday_spread=False)")
-                    should_open = True
-                elif direction == 'long' and intraday_diff > 0:
-                    log.info(f"  ✓ 方案1多头当天价差检查通过 ({intraday_diff:.2f} > 0)")
-                    should_open = True
-                elif direction == 'short' and intraday_diff < 0:
-                    log.info(f"  ✓ 方案1空头当天价差检查通过 ({intraday_diff:.2f} < 0)")
-                    should_open = True
-                else:
-                    log.info(f"  ✗ 方案1当天价差不符合{direction}方向要求")
-            elif g.ma_gap_strategy_mode == 2:
-                # 方案2:强制检查日内变化,使用专用阈值
-                if direction == 'long' and intraday_diff_ratio >= g.ma_intraday_threshold_scheme2:
-                    log.info(f"  ✓ 方案2多头日内变化检查通过 ({intraday_diff_ratio:.2%} >= {g.ma_intraday_threshold_scheme2:.2%})")
-                    should_open = True
-                elif direction == 'short' and intraday_diff_ratio <= -g.ma_intraday_threshold_scheme2:
-                    log.info(f"  ✓ 方案2空头日内变化检查通过 ({intraday_diff_ratio:.2%} <= {-g.ma_intraday_threshold_scheme2:.2%})")
-                    should_open = True
-                else:
-                    log.info(f"  ✗ 方案2日内变化不符合{direction}方向要求(阈值: ±{g.ma_intraday_threshold_scheme2:.2%})")
-            
-            if should_open:
-                # 执行开仓
-                log.info(f"  准备开仓: {symbol} {direction}")
-                target_hands = calculate_target_hands(context, dominant_future, direction)
-                
-                if target_hands > 0:
-                    success = open_position(context, dominant_future, target_hands, direction, 
-                                          f'均线形态开仓')
-                    if success:
-                        log.info(f"  ✓✓ {symbol} 开仓成功,从候选列表移除")
-                        candidates_to_remove.append(dominant_future)
-                    else:
-                        log.warning(f"  ✗ {symbol} 开仓失败")
-                else:
-                    log.warning(f"  ✗ {symbol} 计算目标手数为0,跳过开仓")
-                    
-        except Exception as e:
-            log.warning(f"{dominant_future} 处理时出错: {str(e)}")
-            continue
-    
-    # 从候选列表中移除已开仓的品种
-    for future in candidates_to_remove:
-        if future in g.daily_ma_candidates:
-            del g.daily_ma_candidates[future]
-    
-    log.info(f"剩余候选品种: {list(g.daily_ma_candidates.keys())}")
-    log.info("=" * 60)
-
-def check_stop_loss_profit(context):
-    """阶段三:止损止盈检查(所有时间点)"""
-    # 先检查换月移仓
-    position_auto_switch(context)
-    
-    # 获取当前小时(HH),用于区分夜盘/日盘时段
-    current_time = str(context.current_dt.time())[:2]
-    
-    # 判断是否为夜盘时间
-    is_night_session = (current_time in ['21', '22', '23', '00', '01', '02'])
-    
-    # 遍历所有持仓进行止损止盈检查
-    subportfolio = context.subportfolios[0]
-    long_positions = list(subportfolio.long_positions.values())
-    short_positions = list(subportfolio.short_positions.values())
-    
-    closed_count = 0
-    skipped_count = 0
-    
-    for position in long_positions + short_positions:
-        security = position.security
-        underlying_symbol = security.split('.')[0][:-4]
-        
-        # 检查交易时间适配性
-        has_night_session = get_futures_config(underlying_symbol, 'has_night_session', False)
-        
-        # 如果是夜盘时间,但品种不支持夜盘交易,则跳过
-        if is_night_session and not has_night_session:
-            skipped_count += 1
-            continue
-        
-        # 执行止损止盈检查
-        if check_position_stop_loss_profit(context, position):
-            closed_count += 1
-    
-    if closed_count > 0:
-        log.info(f"执行了 {closed_count} 次止损止盈")
-    
-    if skipped_count > 0:
-        log.info(f"夜盘时间跳过 {skipped_count} 个日间品种的止损止盈检查")
-
-def check_position_stop_loss_profit(context, position):
-    """检查单个持仓的止损止盈"""
-    security = position.security
-    
-    if security not in g.trade_history:
-        return False
-    
-    trade_info = g.trade_history[security]
-    direction = trade_info['direction']
-    entry_price = trade_info['entry_price']
-    entry_time = trade_info['entry_time']
-    current_price = position.price
-    
-    # 计算当前盈亏比率
-    if direction == 'long':
-        profit_rate = (current_price - entry_price) / entry_price
-    else:
-        profit_rate = (entry_price - current_price) / entry_price
-    
-    # 检查固定止损
-    if profit_rate <= -g.fixed_stop_loss_rate:
-        log.info(f"触发固定止损 {security} {direction}, 当前亏损率: {profit_rate:.3%}, "
-                f"成本价: {entry_price:.2f}, 当前价格: {current_price:.2f}")
-        close_position(context, security, direction)
-        return True
-    
-    # 检查是否启用均线跟踪止盈
-    if not trade_info.get('ma_trailing_enabled', True):
-        return False
-
-    # 检查均线跟踪止盈
-    # 获取持仓天数
-    entry_date = entry_time.date()
-    current_date = context.current_dt.date()
-    all_trade_days = get_all_trade_days()
-    holding_days = sum((entry_date <= d <= current_date) for d in all_trade_days)
-    
-    # 计算变化率
-    today_price = get_current_data()[security].last_price
-    avg_daily_change_rate = calculate_average_daily_change_rate(security)
-    historical_data = attribute_history(security, 1, '1d', ['close'])
-    yesterday_close = historical_data['close'].iloc[-1]
-    today_change_rate = abs((today_price - yesterday_close) / yesterday_close)
-    
-    # 根据时间判断使用的偏移量
-    current_time = context.current_dt.time()
-    target_time = datetime.strptime('14:55:00', '%H:%M:%S').time()
-    if current_time > target_time:
-        offset_ratio = g.ma_offset_ratio_close
-    else:
-        offset_ratio = g.ma_offset_ratio_normal
-    
-    # 选择止损均线
-    close_line = None
-    if today_change_rate >= 1.5 * avg_daily_change_rate:
-        close_line = 'ma5'  # 波动剧烈时用短周期
-    elif holding_days <= g.days_for_adjustment:
-        close_line = 'ma5'  # 持仓初期用短周期
-    else:
-        close_line = 'ma5' if today_change_rate >= 1.2 * avg_daily_change_rate else 'ma10'
-    
-    # 计算实时均线值
-    ma_values = calculate_realtime_ma_values(security, [5, 10])
-    ma_value = ma_values[close_line]
-    
-    # 应用偏移量
-    if direction == 'long':
-        adjusted_ma_value = ma_value * (1 - offset_ratio)
-    else:
-        adjusted_ma_value = ma_value * (1 + offset_ratio)
-    
-    # 判断是否触发均线止损
-    if (direction == 'long' and today_price < adjusted_ma_value) or \
-       (direction == 'short' and today_price > adjusted_ma_value):
-        log.info(f"触发均线跟踪止盈 {security} {direction}, 止损均线: {close_line}, "
-                f"均线值: {ma_value:.2f}, 调整后: {adjusted_ma_value:.2f}, "
-                f"当前价: {today_price:.2f}, 持仓天数: {holding_days}")
-        close_position(context, security, direction)
-        return True
-    
-    return False
-
-############################ 核心辅助函数 ###################################
-
-def calculate_ma_values(data, periods):
-    """计算均线值
-    
-    Args:
-        data: DataFrame,包含'close'列的历史数据(最后一行是最新的数据)
-        periods: list,均线周期列表,如[5, 10, 20, 30]
-    
-    Returns:
-        dict: {'MA5': value, 'MA10': value, 'MA20': value, 'MA30': value}
-        返回最后一行(最新日期)的各周期均线值
-    """
-    ma_values = {}
-    
-    for period in periods:
-        if len(data) >= period:
-            # 计算最后period天的均线值
-            ma_values[f'MA{period}'] = data['close'].iloc[-period:].mean()
-        else:
-            ma_values[f'MA{period}'] = None
-    
-    return ma_values
-
-
-def calculate_ma_proximity_counts(data, periods, lookback_days):
-    """统计近 lookback_days 天收盘价贴近各均线的次数"""
-    proximity_counts = {f'MA{period}': 0 for period in periods}
-
-    if len(data) < lookback_days:
-        return proximity_counts
-
-    closes = data['close'].iloc[-lookback_days:]
-    ma_series = {
-        period: data['close'].rolling(window=period).mean().iloc[-lookback_days:]
-        for period in periods
-    }
-
-    for idx, close_price in enumerate(closes):
-        min_diff = None
-        closest_period = None
-
-        for period in periods:
-            ma_value = ma_series[period].iloc[idx]
-            if pd.isna(ma_value):
-                continue
-            diff = abs(close_price - ma_value)
-            if min_diff is None or diff < min_diff:
-                min_diff = diff
-                closest_period = period
-
-        if closest_period is not None:
-            proximity_counts[f'MA{closest_period}'] += 1
-
-    return proximity_counts
-
-
-def check_ma_pattern(ma_values, direction):
-    """检查均线排列模式是否符合方向要求
-    
-    Args:
-        ma_values: dict,包含MA5, MA10, MA20, MA30的均线值
-        direction: str,'long'或'short'
-    
-    Returns:
-        bool: 是否符合均线排列要求
-    """
-    ma5 = ma_values['MA5']
-    ma10 = ma_values['MA10']
-    ma20 = ma_values['MA20']
-    ma30 = ma_values['MA30']
-    
-    if direction == 'long':
-        # 多头模式:MA30 <= MA20 <= MA10 <= MA5 或 MA30 <= MA20 <= MA5 <= MA10
-        pattern1 = (ma30 <= ma20 <= ma10 <= ma5)
-        pattern2 = (ma30 <= ma20 <= ma5 <= ma10)
-        return pattern1 or pattern2
-    elif direction == 'short':
-        # 空头模式:MA10 <= MA5 <= MA20 <= MA30 或 MA5 <= MA10 <= MA20 <= MA30
-        pattern1 = (ma10 <= ma5 <= ma20 <= ma30)
-        pattern2 = (ma5 <= ma10 <= ma20 <= ma30)
-        return pattern1 or pattern2
-    else:
-        return False
-
-def check_historical_ma_pattern_consistency(historical_data, direction, lookback_days, consistency_threshold):
-    """检查历史均线模式的一致性
-    
-    Args:
-        historical_data: DataFrame,包含足够天数的历史数据
-        direction: str,'long'或'short'
-        lookback_days: int,检查过去多少天
-        consistency_threshold: float,一致性阈值(0-1之间)
-    
-    Returns:
-        tuple: (bool, float) - (是否通过一致性检查, 实际一致性比例)
-    """
-    if len(historical_data) < max(g.ma_periods) + lookback_days:
-        # 历史数据不足
-        return False, 0.0
-    
-    match_count = 0
-    total_count = lookback_days
-    
-    # 检查过去lookback_days天的均线模式
-    for i in range(lookback_days):
-        # 获取截至倒数第(i+1)天(含该天)的数据(i=0时最后一行是昨天,i=1时是前天,依此类推)
-        if i == 0:
-            data_slice = historical_data
-        else:
-            # iloc[:-i] 的最后一行正是倒数第(i+1)行,保证该天被包含在切片内
-            data_slice = historical_data.iloc[:-i]
-        
-        # 计算该天的均线值
-        ma_values = calculate_ma_values(data_slice, g.ma_periods)
-        
-        # 检查是否符合模式
-        if check_ma_pattern(ma_values, direction):
-            match_count += 1
-    
-    consistency_ratio = match_count / total_count
-    passed = consistency_ratio >= consistency_threshold
-    
-    return passed, consistency_ratio
-
-############################ 交易执行函数 ###################################
-
-def open_position(context, security, target_hands, direction, reason=''):
-    """开仓"""
-    try:
-        # 记录交易前的可用资金
-        cash_before = context.portfolio.available_cash
-        
-        # 使用order_target按手数开仓
-        order = order_target(security, target_hands, side=direction)
-        
-        if order is not None and order.filled > 0:
-            # 记录交易后的可用资金
-            cash_after = context.portfolio.available_cash
-            
-            # 计算实际资金变化
-            cash_change = cash_before - cash_after
-            
-            # 获取订单价格和数量
-            order_price = order.avg_cost if order.avg_cost else order.price
-            order_amount = order.filled
-            
-            # 记录当日交易
-            underlying_symbol = security.split('.')[0][:-4]
-            g.today_trades.append({
-                'security': security,
-                'underlying_symbol': underlying_symbol,
-                'direction': direction,
-                'order_amount': order_amount,
-                'order_price': order_price,
-                'cash_change': cash_change,
-                'time': context.current_dt
-            })
-            
-            # 记录交易信息
-            g.trade_history[security] = {
-                'entry_price': order_price,
-                'target_hands': target_hands,
-                'actual_hands': order_amount,
-                'actual_margin': cash_change,
-                'direction': direction,
-                'entry_time': context.current_dt
-            }
-
-            ma_trailing_enabled = True
-            if direction == 'long':
-                ma_values_at_entry = calculate_realtime_ma_values(security, [5])
-                ma5_value = ma_values_at_entry.get('ma5')
-                if ma5_value is not None and order_price < ma5_value:
-                    ma_trailing_enabled = False
-                    log.info(f"禁用均线跟踪止盈: {security} {direction}, 开仓价 {order_price:.2f} < MA5 {ma5_value:.2f}")
-
-            g.trade_history[security]['ma_trailing_enabled'] = ma_trailing_enabled
-            
-            log.info(f"开仓成功: {security} {direction} {order_amount}手 @{order_price:.2f}, "
-                    f"保证金: {cash_change:.0f}, 原因: {reason}")
-            
-            return True
-            
-    except Exception as e:
-        log.warning(f"开仓失败 {security}: {str(e)}")
-    
-    return False
-
-def close_position(context, security, direction):
-    """平仓"""
-    try:
-        # 使用order_target平仓到0手
-        order = order_target(security, 0, side=direction)
-        
-        if order is not None and order.filled > 0:
-            underlying_symbol = security.split('.')[0][:-4]
-            
-            # 记录当日交易(平仓)
-            g.today_trades.append({
-                'security': security,
-                'underlying_symbol': underlying_symbol,
-                'direction': direction,
-                'order_amount': -order.filled,
-                'order_price': order.avg_cost if order.avg_cost else order.price,
-                'cash_change': 0,
-                'time': context.current_dt
-            })
-            
-            log.info(f"平仓成功: {underlying_symbol} {direction} {order.filled}手")
-            
-            # 从交易历史中移除
-            if security in g.trade_history:
-                del g.trade_history[security]
-            return True
-            
-    except Exception as e:
-        log.warning(f"平仓失败 {security}: {str(e)}")
-    
-    return False
-
-############################ 辅助函数 ###################################
-
-def get_futures_config(underlying_symbol, config_key=None, default_value=None):
-    """获取期货品种配置信息的辅助函数"""
-    if underlying_symbol not in g.futures_config:
-        if config_key and default_value is not None:
-            return default_value
-        return {}
-    
-    if config_key is None:
-        return g.futures_config[underlying_symbol]
-    
-    return g.futures_config[underlying_symbol].get(config_key, default_value)
-
-def get_margin_rate(underlying_symbol, direction, default_rate=0.10):
-    """获取保证金比例的辅助函数"""
-    return g.futures_config.get(underlying_symbol, {}).get('margin_rate', {}).get(direction, default_rate)
-
-def get_multiplier(underlying_symbol, default_multiplier=10):
-    """获取合约乘数的辅助函数"""
-    return g.futures_config.get(underlying_symbol, {}).get('multiplier', default_multiplier)
-
-def calculate_target_hands(context, security, direction):
-    """计算目标开仓手数"""
-    current_price = get_current_data()[security].last_price
-    underlying_symbol = security.split('.')[0][:-4]
-    
-    # 使用保证金比例
-    margin_rate = get_margin_rate(underlying_symbol, direction)
-    multiplier = get_multiplier(underlying_symbol)
-    
-    # 计算单手保证金
-    single_hand_margin = current_price * multiplier * margin_rate
-    
-    # 还要考虑可用资金限制
-    available_cash = context.portfolio.available_cash * g.usage_percentage
-    
-    # 根据单个标的最大持仓保证金限制计算开仓数量
-    max_margin = g.max_margin_per_position
-    
-    if single_hand_margin <= max_margin:
-        # 如果单手保证金不超过最大限制,计算最大可开仓手数
-        max_hands = int(max_margin / single_hand_margin)
-        max_hands_by_cash = int(available_cash / single_hand_margin)
-        
-        # 取两者较小值
-        actual_hands = min(max_hands, max_hands_by_cash)
-        
-        # 确保至少开1手
-        actual_hands = max(1, actual_hands)
-        
-        log.info(f"单手保证金: {single_hand_margin:.0f}, 目标开仓手数: {actual_hands}")
-        
-        return actual_hands
-    else:
-        # 如果单手保证金超过最大限制,默认开仓1手
-        actual_hands = 1
-        
-        log.info(f"单手保证金: {single_hand_margin:.0f} 超过最大限制: {max_margin}, 默认开仓1手")
-        
-        return actual_hands
-
-def check_symbol_prefix_match(symbol, hold_symbols):
-    """检查是否有相似的持仓品种"""
-    symbol_prefix = symbol[:-9]
-    
-    for hold_symbol in hold_symbols:
-        hold_symbol_prefix = hold_symbol[:-9] if len(hold_symbol) > 9 else hold_symbol
-        
-        if symbol_prefix == hold_symbol_prefix:
-            return True
-    return False
-
-def calculate_average_daily_change_rate(security, days=30):
-    """计算日均变化率"""
-    historical_data = attribute_history(security, days + 1, '1d', ['close'])
-    daily_change_rates = abs(historical_data['close'].pct_change()).iloc[1:]
-    return daily_change_rates.mean()
-
-def calculate_realtime_ma_values(security, ma_periods):
-    """计算包含当前价格的实时均线值"""
-    historical_data = attribute_history(security, max(ma_periods), '1d', ['close'])
-    today_price = get_current_data()[security].last_price
-    close_prices = historical_data['close'].tolist() + [today_price]
-    ma_values = {f'ma{period}': sum(close_prices[-period:]) / period for period in ma_periods}
-    return ma_values
-
-def after_market_close(context):
-    """收盘后运行函数"""
-    log.info(str('函数运行时间(after_market_close):'+str(context.current_dt.time())))
-    
-    # 清空候选列表(每天重新检查)
-    g.daily_ma_candidates = {}
-    
-    # 清空排除缓存(每天重新检查)
-    excluded_count = len(g.excluded_contracts)
-    if excluded_count > 0:
-        log.info(f"清空排除缓存,共 {excluded_count} 个合约")
-        g.excluded_contracts = {}
-    
-    # 只有当天有交易时才打印统计信息
-    if g.today_trades:
-        print_daily_trading_summary(context)
-        
-        # 清空当日交易记录
-        g.today_trades = []
-    
-    log.info('##############################################################')
-
-def print_daily_trading_summary(context):
-    """打印当日交易汇总"""
-    if not g.today_trades:
-        return
-    
-    log.info("\n=== 当日交易汇总 ===")
-    total_margin = 0
-    
-    for trade in g.today_trades:
-        if trade['order_amount'] > 0:  # 开仓
-            log.info(f"开仓 {trade['underlying_symbol']} {trade['direction']} {trade['order_amount']}手 "
-                  f"价格:{trade['order_price']:.2f} 保证金:{trade['cash_change']:.0f}")
-            total_margin += trade['cash_change']
-        else:  # 平仓
-            log.info(f"平仓 {trade['underlying_symbol']} {trade['direction']} {abs(trade['order_amount'])}手 "
-                  f"价格:{trade['order_price']:.2f}")
-    
-    log.info(f"当日保证金占用: {total_margin:.0f}")
-    log.info("==================\n")
-
-########################## 自动移仓换月函数 #################################
-def position_auto_switch(context, pindex=0, switch_func=None, callback=None):
-    """期货自动移仓换月"""
-    import re
-    subportfolio = context.subportfolios[pindex]
-    symbols = set(subportfolio.long_positions.keys()) | set(subportfolio.short_positions.keys())
-    switch_result = []
-    for symbol in symbols:
-        match = re.match(r"(?P<underlying_symbol>[A-Z]{1,})", symbol)
-        if not match:
-            raise ValueError("未知期货标的: {}".format(symbol))
-        else:
-            dominant = get_dominant_future(match.groupdict()["underlying_symbol"])
-            cur = get_current_data()
-            symbol_last_price = cur[symbol].last_price
-            dominant_last_price = cur[dominant].last_price
-            
-            if dominant > symbol:
-                for positions_ in (subportfolio.long_positions, subportfolio.short_positions):
-                    if symbol not in positions_.keys():
-                        continue
-                    else :
-                        p = positions_[symbol]
-
-                    if switch_func is not None:
-                        switch_func(context, pindex, p, dominant)
-                    else:
-                        amount = p.total_amount
-                        # 跌停不能开空和平多,涨停不能开多和平空
-                        if p.side == "long":
-                            symbol_low_limit = cur[symbol].low_limit
-                            dominant_high_limit = cur[dominant].high_limit
-                            if symbol_last_price <= symbol_low_limit:
-                                log.warning("标的{}跌停,无法平仓。移仓换月取消。".format(symbol))
-                                continue
-                            elif dominant_last_price >= dominant_high_limit:
-                                log.warning("标的{}涨停,无法开仓。移仓换月取消。".format(dominant))
-                                continue
-                            else:
-                                log.info("进行移仓换月: ({0},long) -> ({1},long)".format(symbol, dominant))
-                                order_old = order_target(symbol, 0, side='long')
-                                if order_old != None and order_old.filled > 0:
-                                    order_new = order_target(dominant, amount, side='long')
-                                    if order_new != None and order_new.filled > 0:
-                                        switch_result.append({"before": symbol, "after": dominant, "side": "long"})
-                                        # 换月成功,更新交易记录
-                                        if symbol in g.trade_history:
-                                            g.trade_history[dominant] = g.trade_history[symbol]
-                                            del g.trade_history[symbol]
-                                    else:
-                                        log.warning("标的{}交易失败,无法开仓。移仓换月失败。".format(dominant))
-                        if p.side == "short":
-                            symbol_high_limit = cur[symbol].high_limit
-                            dominant_low_limit = cur[dominant].low_limit
-                            if symbol_last_price >= symbol_high_limit:
-                                log.warning("标的{}涨停,无法平仓。移仓换月取消。".format(symbol))
-                                continue
-                            elif dominant_last_price <= dominant_low_limit:
-                                log.warning("标的{}跌停,无法开仓。移仓换月取消。".format(dominant))
-                                continue
-                            else:
-                                log.info("进行移仓换月: ({0},short) -> ({1},short)".format(symbol, dominant))
-                                order_old = order_target(symbol, 0, side='short')
-                                if order_old != None and order_old.filled > 0:
-                                    order_new = order_target(dominant, amount, side='short')
-                                    if order_new != None and order_new.filled > 0:
-                                        switch_result.append({"before": symbol, "after": dominant, "side": "short"})
-                                        # 换月成功,更新交易记录
-                                        if symbol in g.trade_history:
-                                            g.trade_history[dominant] = g.trade_history[symbol]
-                                            del g.trade_history[symbol]
-                                    else:
-                                        log.warning("标的{}交易失败,无法开仓。移仓换月失败。".format(dominant))
-                        if callback:
-                            callback(context, pindex, p, dominant)
-    return switch_result
-

+ 35 - 14
Lib/future/MAPatternStrategy_v002_核心逻辑.md → Lib/future/MAPatternStrategy_v002.md

@@ -19,6 +19,7 @@
 4. 历史模式一致性
 5. **跳空方向检查**(必选)- 趋势跟随或逆势操作
 6. **跳空幅度检查**(可选)- 是否达到阈值
+7. **MA5分布过滤**(可选)- 收盘价相对于MA5的分布情况
 
 ### 第二阶段:最终价格验证和开仓
 **位置**:`check_open_and_stop`函数
@@ -59,16 +60,26 @@
    - **多头模式**:满足以下任一模式
      - MA30 ≤ MA20 ≤ MA10 ≤ MA5
      - MA30 ≤ MA20 ≤ MA5 ≤ MA10
+     - MA20 ≤ MA30 ≤ MA10 ≤ MA5
+     - MA20 ≤ MA30 ≤ MA5 ≤ MA10
    - **空头模式**:满足以下任一模式
      - MA10 ≤ MA5 ≤ MA20 ≤ MA30
      - MA5 ≤ MA10 ≤ MA20 ≤ MA30
+     - MA10 ≤ MA5 ≤ MA30 ≤ MA20
+     - MA5 ≤ MA10 ≤ MA30 ≤ MA20
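
上述多头/空头排列模式可以直接翻译成一个纯函数,便于核对各组不等式之间的关系。以下仅为示意草图,`ma_values` 的键名沿用文中约定,实际实现以仓库内 `MAPatternStrategy_v002.py` 为准:

```python
def check_ma_pattern(ma_values, direction):
    """按上文列出的排列模式判断均线走势(示意实现)。

    ma_values: 形如 {'MA5': ..., 'MA10': ..., 'MA20': ..., 'MA30': ...}
    direction: 'long' 或 'short'
    """
    ma5, ma10 = ma_values['MA5'], ma_values['MA10']
    ma20, ma30 = ma_values['MA20'], ma_values['MA30']
    if any(v is None for v in (ma5, ma10, ma20, ma30)):
        return False  # 数据不足以计算某条均线时直接不通过
    if direction == 'long':
        return ((ma30 <= ma20 <= ma10 <= ma5) or
                (ma30 <= ma20 <= ma5 <= ma10) or
                (ma20 <= ma30 <= ma10 <= ma5) or
                (ma20 <= ma30 <= ma5 <= ma10))
    if direction == 'short':
        return ((ma10 <= ma5 <= ma20 <= ma30) or
                (ma5 <= ma10 <= ma20 <= ma30) or
                (ma10 <= ma5 <= ma30 <= ma20) or
                (ma5 <= ma10 <= ma30 <= ma20))
    return False
```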
 
 **4. 历史均线模式一致性检查**
    - 检查过去10天的均线模式一致性
    - 一致性比例 ≥ 80%(`g.ma_pattern_consistency_threshold = 0.8`)
    - 即10天中至少有8天符合当前方向的均线排列
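
一致性比例的计算可以参考下面这段示意代码(沿用上面草图中的 `check_ma_pattern`;假设 `data` 为包含 'close' 列、最后一行为前一交易日的 pandas DataFrame,长度不少于 30 + lookback_days 行;函数名为本文所拟,并非源码实际命名):

```python
def check_pattern_consistency(data, direction, lookback_days=10, threshold=0.8):
    """统计最近 lookback_days 天中均线排列符合 direction 的天数占比(示意实现)。"""
    match_count = 0
    for i in range(lookback_days):
        # 截至倒数第 (i+1) 天(含该天)的数据:i=0 取全部,i=1 去掉最后一行,依此类推
        data_slice = data if i == 0 else data.iloc[:-i]
        ma_values = {f'MA{p}': data_slice['close'].iloc[-p:].mean()
                     for p in (5, 10, 20, 30)}
        if check_ma_pattern(ma_values, direction):
            match_count += 1
    ratio = match_count / lookback_days
    return ratio >= threshold, ratio
```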
 
-**5. 跳空方向检查**(第一部分,必选)
+**5. MA5分布过滤**(当`g.enable_ma_distribution_filter = True`时启用)
+   - 检查过去5天收盘价相对于MA5的分布情况
+   - 多头:收盘价 < MA5的天数需满足最低比例要求
+   - 空头:收盘价 > MA5的天数需满足最低比例要求
+   - 默认要求满足条件的天数占比 ≥ 40%(`g.ma_distribution_min_ratio = 0.4`)
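
该过滤条件的一个最小示意实现如下(函数名 `check_ma5_distribution` 为本文所拟,并非源码中的实际函数;假设 `data` 为包含 'close' 列的 pandas DataFrame,最后一行为最新完整交易日):

```python
def check_ma5_distribution(data, direction, lookback_days=5, min_ratio=0.4):
    """统计近 lookback_days 天收盘价位于 MA5 之下(多头)或之上(空头)的比例。"""
    ma5 = data['close'].rolling(window=5).mean()
    closes = data['close'].iloc[-lookback_days:]
    ma5_recent = ma5.iloc[-lookback_days:]
    if ma5_recent.isna().any():
        return False  # 数据不足 5 天、无法计算 MA5 时直接不通过
    if direction == 'long':
        hit_days = (closes < ma5_recent).sum()
    elif direction == 'short':
        hit_days = (closes > ma5_recent).sum()
    else:
        return False
    return hit_days / lookback_days >= min_ratio
```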
+
+**6. 跳空方向检查**(第一部分,必选)
    - **策略1 - 趋势跟随**(`g.ma_gap_strategy_mode = 1`):
      - 看涨趋势:必须向上跳空(开盘价 > 昨收价)
      - 看跌趋势:必须向下跳空(开盘价 < 昨收价)
@@ -79,11 +90,9 @@
      - 看涨趋势:必须向下跳空(开盘价 < 昨收价)
      - 看跌趋势:必须向上跳空(开盘价 > 昨收价)
 
-**6. 跳空幅度检查**(第二部分,可选)
-   - **选项A**(`g.check_gap_magnitude = True`):检查跳空幅度与阈值比较
-     - 策略1:`|开盘价差比例| >= 0.2%`(`g.ma_open_gap_threshold = 0.002`)
-     - 策略2/3:`|开盘价差比例| >= 0.2%`(`g.ma_open_gap_threshold2 = 0.002`)
-   - **选项B**(`g.check_gap_magnitude = False`):不验证跳空幅度,只要方向正确即可
+**7. 跳空幅度检查**(第二部分,必选)
+   - **策略1**:`|开盘价差比例| >= 0.1%`(`g.ma_open_gap_threshold = 0.001`)
+   - **策略2/3**:`|开盘价差比例| >= 0.1%`(`g.ma_open_gap_threshold2 = 0.001`)
 
 #### 通过条件后的处理
 - 将品种加入候选列表 `g.daily_ma_candidates`
@@ -118,11 +127,20 @@
    - 看跌趋势:当前价格 ≤ (前一日开盘价 + 前一日收盘价) / 2
    - 说明:确保价格在下跳后回归到前一日开盘收盘均值附近
 
+#### 均线穿越得分检查
+- 计算开盘价与当前价相对于各周期均线的穿越情况
+- 多头:价格上涨穿越均线加分,下跌穿越减分
+- 空头:价格下跌穿越均线加分,上涨穿越减分
+- 根据检查时间采用不同的阈值要求:
+  - **14:55时间点**:累计得分需 ≥ 1(`g.ma_cross_threshold = 1`)
+  - **其他时间点**:累计得分需 ≥ 2(`g.ma_cross_threshold + 1 = 2`)
+- 该机制旨在14:55收盘前给予更多开仓机会,而在盘中其他时间点要求更强的趋势确认
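
得分的具体计算方式未包含在本次摘录中;下面按上述描述给出一个推测性的示意草图(函数名 `calculate_ma_cross_score` 为假设命名,穿越的边界处理等细节以源码为准):

```python
def calculate_ma_cross_score(open_price, current_price, ma_values, direction):
    """开盘价与当前价位于某条均线两侧即视为穿越该均线(示意实现)。

    多头:向上穿越 +1、向下穿越 -1;空头:向下穿越 +1、向上穿越 -1。
    """
    score = 0
    for ma in ma_values.values():
        if ma is None:
            continue
        if open_price < ma <= current_price:        # 向上穿越该均线
            score += 1 if direction == 'long' else -1
        elif current_price <= ma < open_price:      # 向下穿越该均线
            score += 1 if direction == 'short' else -1
    return score

# 使用方式示意:14:55 检查点要求 score >= g.ma_cross_threshold(即 1),
# 其余盘中检查点要求 score >= g.ma_cross_threshold + 1(即 2)。
```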
+
 #### 开仓执行流程
 1. 计算目标手数:
    - 单手保证金 = 当前价 × 合约乘数 × 保证金比例
    - 最大开仓手数 = min(最大保证金限制/单手保证金, 可用资金×80%/单手保证金)
-   - 单个标的最大持仓保证金限制:20,000元(`g.max_margin_per_position = 20000`)
+   - 单个标的最大持仓保证金限制:30,000元(`g.max_margin_per_position = 30000`)
 2. 执行开仓并记录交易信息
 3. 从候选列表中移除已开仓品种
 
@@ -195,6 +213,7 @@
 - 在每次策略执行前检查
 - 当主力合约发生变化时自动移仓
 - 考虑涨跌停板限制,避免极端情况下的移仓失败
+- 夜盘订单状态检查:若订单提交后状态停留在'new'(说明当前并非有效交易时段,例如当晚无夜盘),则禁止该交易日内的所有后续操作
 
 ### 缓存机制
 - **排除缓存**:记录当日不符合条件的合约,避免重复检查
@@ -203,35 +222,37 @@
 
 ### 风险控制参数
 - 最大资金使用比例:80%(`g.usage_percentage = 0.8`)
-- 单个标的最大持仓保证金:20,000元
+- 单个标的最大持仓保证金:30,000元
 - 固定止损比例:1%
 - 均线贴近度最低要求:8次
 - 极端趋势过滤阈值:4天
 - 历史一致性要求:80%
 
-### 策略参数配置
+---
+
+## 策略参数配置
 
 **基础参数**
 - 均线周期:[5, 10, 20, 30]
 - 历史数据天数:60天(确保足够计算MA30)
 - 历史均线模式检查天数:10天
 
-**三种策略模式**(`g.ma_gap_strategy_mode`)
+**当前策略模式**(`g.ma_gap_strategy_mode = 3`)
 
 **策略1 - 趋势跟随**:
 - 跳空方向:与趋势一致(看涨上跳/看跌下跳)
-- 跳空幅度:可选检查(`g.check_gap_magnitude`)
+- 跳空幅度:必选检查,阈值0.1%
 - 最终价格验证:日内价差可选检查(`g.check_intraday_spread`)
 - 适用场景:追随趋势方向的强势突破
 
 **策略2 - 逆势操作+强制阈值**:
 - 跳空方向:与趋势相反(看涨下跳/看跌上跳)
-- 跳空幅度:可选检查(`g.check_gap_magnitude`)
+- 跳空幅度:必选检查,阈值0.1%
 - 最终价格验证:强制日内变化阈值≥0.5%(`g.ma_intraday_threshold_scheme2`)
 - 适用场景:捕捉回调后的强力反弹
 
-**策略3 - 逆势操作+价格回归**:
+**策略3 - 逆势操作+价格回归**(当前使用):
 - 跳空方向:与趋势相反(看涨下跳/看跌上跳)
-- 跳空幅度:可选检查(`g.check_gap_magnitude`)
+- 跳空幅度:必选检查,阈值0.1%
 - 最终价格验证:价格回归到前日开盘收盘均值
 - 适用场景:捕捉回调后的均值回归行情
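
作为补充,策略3的"价格回归"验证可以写成下面这个示意函数。本次摘录只给出了看跌趋势的判定式(当前价 ≤ 前一日开盘收盘均值),看涨方向这里按对称关系假设为"当前价 ≥ 均值",请以源码为准:

```python
def passes_mean_reversion_check(current_price, prev_open, prev_close, direction):
    """策略3最终价格验证的示意实现,以前一日开盘价与收盘价的均值为基准。"""
    midpoint = (prev_open + prev_close) / 2
    if direction == 'short':
        return current_price <= midpoint          # 看跌趋势:原文给出的条件
    if direction == 'long':
        return current_price >= midpoint          # 看涨趋势:按对称关系假设
    return False
```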

File diff suppressed because it is too large
+ 586 - 142
Lib/future/MAPatternStrategy_v002.py


+ 1617 - 0
Lib/future/MAPatternStrategy_v002_bak.py

@@ -0,0 +1,1617 @@
+# 导入函数库
+from jqdata import *
+from jqdata import finance
+import pandas as pd
+import numpy as np
+import math
+from datetime import date, datetime, timedelta, time
+import re
+
+# 顺势交易策略 v001
+# 基于均线走势(前提条件)+ K线形态(开盘价差、当天价差)的期货交易策略
+#
+# 核心逻辑:
+# 1. 开盘时检查均线排列(如 MA30<=MA20<=MA10<=MA5 为多头模式,反向排列为空头模式)
+# 2. 按策略模式(g.ma_gap_strategy_mode)检查开盘跳空的方向与幅度(幅度阈值当前为0.1%)
+# 3. 在夜盘(21:05起)与日盘(09:05~14:55)的多个检查点验证开仓条件,满足则开仓
+# 4. 应用固定止损和动态追踪止盈
+# 5. 自动换月移仓
+
+# 设置以便完整打印 DataFrame
+pd.set_option('display.max_rows', None)
+pd.set_option('display.max_columns', None)
+pd.set_option('display.width', None)
+pd.set_option('display.max_colwidth', 20)
+
+## 初始化函数,设定基准等等
+def initialize(context):
+    # 设定沪深300作为基准
+    set_benchmark('000300.XSHG')
+    # 开启动态复权模式(真实价格)
+    set_option('use_real_price', True)
+    # 输出内容到日志
+    log.info('=' * 60)
+    log.info('均线形态交易策略 v001 初始化开始')
+    log.info('策略类型: 均线走势 + K线形态')
+    log.info('=' * 60)
+
+    ### 期货相关设定 ###
+    # 设定账户为金融账户
+    set_subportfolios([SubPortfolioConfig(cash=context.portfolio.starting_cash, type='index_futures')])
+    # 期货类每笔交易时的手续费是: 买入时万分之0.23,卖出时万分之0.23,平今仓为万分之23
+    set_order_cost(OrderCost(open_commission=0.000023, close_commission=0.000023, close_today_commission=0.0023), type='index_futures')
+    
+    # 设置期货交易的滑点
+    set_slippage(StepRelatedSlippage(2))
+    
+    # 初始化全局变量
+    g.usage_percentage = 0.8  # 最大资金使用比例
+    g.max_margin_per_position = 30000  # 单个标的最大持仓保证金(元)
+    
+    # 均线策略参数
+    g.ma_periods = [5, 10, 20, 30]  # 均线周期
+    g.ma_historical_days = 60  # 获取历史数据天数(确保足够计算MA30)
+    g.ma_open_gap_threshold = 0.001  # 方案1开盘价差阈值(0.1%)
+    g.ma_pattern_lookback_days = 10  # 历史均线模式一致性检查的天数
+    g.ma_pattern_consistency_threshold = 0.8  # 历史均线模式一致性阈值(80%)
+    g.check_intraday_spread = False  # 是否检查日内价差(True: 检查, False: 跳过)
+    g.ma_proximity_min_threshold = 8  # MA5与MA10贴近计数和的最低阈值
+    g.ma_pattern_extreme_days_threshold = 4  # 极端趋势天数阈值
+    g.ma_distribution_lookback_days = 5  # MA5分布过滤回溯天数
+    g.ma_distribution_min_ratio = 0.4  # MA5分布满足比例阈值
+    g.enable_ma_distribution_filter = True  # 是否启用MA5分布过滤
+    g.ma_cross_threshold = 1  # 均线穿越数量阈值
+    g.enable_open_gap_filter = True  # 是否启用开盘价差过滤
+    
+    # 均线价差策略方案选择
+    g.ma_gap_strategy_mode = 3  # 策略模式选择(1: 趋势跟随, 2: 逆势+强制日内阈值, 3: 逆势+价格回归)
+    g.ma_open_gap_threshold2 = 0.001  # 方案2/3开盘价差阈值(0.1%)
+    g.ma_intraday_threshold_scheme2 = 0.005  # 方案2日内变化阈值(0.5%)
+    
+    # 止损止盈策略参数
+    g.fixed_stop_loss_rate = 0.01  # 固定止损比率(1%)
+    g.ma_offset_ratio_normal = 0.003  # 均线跟踪止盈常规偏移量(0.3%)
+    g.ma_offset_ratio_close = 0.01  # 均线跟踪止盈收盘前偏移量(1%)
+    g.days_for_adjustment = 4  # 持仓天数调整阈值
+    
+    # 输出策略参数
+    log.info("均线形态策略参数:")
+    log.info(f"  均线周期: {g.ma_periods}")
+    log.info(f"  策略模式: 方案{g.ma_gap_strategy_mode}")
+    log.info(f"  方案1开盘价差阈值: {g.ma_open_gap_threshold:.1%}")
+    log.info(f"  方案2开盘价差阈值: {g.ma_open_gap_threshold2:.1%}")
+    log.info(f"  方案2日内变化阈值: {g.ma_intraday_threshold_scheme2:.1%}")
+    log.info(f"  历史均线模式检查天数: {g.ma_pattern_lookback_days}天")
+    log.info(f"  历史均线模式一致性阈值: {g.ma_pattern_consistency_threshold:.1%}")
+    log.info(f"  极端趋势天数阈值: {g.ma_pattern_extreme_days_threshold}")
+    log.info(f"  均线贴近计数阈值: {g.ma_proximity_min_threshold}")
+    log.info(f"  MA5分布过滤天数: {g.ma_distribution_lookback_days}")
+    log.info(f"  MA5分布最低比例: {g.ma_distribution_min_ratio:.0%}")
+    log.info(f"  启用MA5分布过滤: {g.enable_ma_distribution_filter}")
+    log.info(f"  是否检查日内价差: {g.check_intraday_spread}")
+    log.info(f"  均线穿越阈值: {g.ma_cross_threshold}")
+    log.info(f"  是否启用开盘价差过滤: {g.enable_open_gap_filter}")
+    log.info(f"  固定止损: {g.fixed_stop_loss_rate:.1%}")
+    log.info(f"  均线跟踪止盈常规偏移: {g.ma_offset_ratio_normal:.1%}")
+    log.info(f"  均线跟踪止盈收盘前偏移: {g.ma_offset_ratio_close:.1%}")
+    log.info(f"  持仓天数调整阈值: {g.days_for_adjustment}天")
+    
+    # 期货品种完整配置字典
+    g.futures_config = {
+        # 贵金属
+        'AU': {'has_night_session': True, 'margin_rate': {'long': 0.21, 'short': 0.21}, 'multiplier': 1000, 'trading_start_time': '21:00'},
+        'AG': {'has_night_session': True, 'margin_rate': {'long': 0.22, 'short': 0.22}, 'multiplier': 15, 'trading_start_time': '21:00'},
+        
+        # 有色金属
+        'CU': {'has_night_session': True, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        'AL': {'has_night_session': True, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        'ZN': {'has_night_session': True, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        'PB': {'has_night_session': True, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        'NI': {'has_night_session': True, 'margin_rate': {'long': 0.16, 'short': 0.16}, 'multiplier': 1, 'trading_start_time': '21:00'},
+        'SN': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 1, 'trading_start_time': '21:00'},
+        'SS': {'has_night_session': True, 'margin_rate': {'long': 0.07, 'short': 0.07}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        
+        # 黑色系
+        'RB': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'HC': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'I': {'has_night_session': True, 'margin_rate': {'long': 0.16, 'short': 0.16}, 'multiplier': 100, 'trading_start_time': '21:00'},
+        'JM': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 100, 'trading_start_time': '21:00'},
+        'J': {'has_night_session': True, 'margin_rate': {'long': 0.25, 'short': 0.25}, 'multiplier': 60, 'trading_start_time': '21:00'},
+        
+        # 能源化工
+        'SP': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'FU': {'has_night_session': True, 'margin_rate': {'long': 0.16, 'short': 0.16}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'BU': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'RU': {'has_night_session': True, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'BR': {'has_night_session': True, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        'SC': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 1000, 'trading_start_time': '21:00'},
+        'NR': {'has_night_session': True, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'LU': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'LC': {'has_night_session': False, 'margin_rate': {'long': 0.16, 'short': 0.16}, 'multiplier': 1, 'trading_start_time': '09:00'},
+        
+        # 化工
+        'FG': {'has_night_session': True, 'margin_rate': {'long': 0.16, 'short': 0.16}, 'multiplier': 20, 'trading_start_time': '21:00'},
+        'TA': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        'MA': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'SA': {'has_night_session': True, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 20, 'trading_start_time': '21:00'},
+        'L': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        'V': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        'EG': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'PP': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        'EB': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        'PG': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 20, 'trading_start_time': '21:00'},
+        'PX': {'has_night_session': True, 'margin_rate': {'long': 0.1, 'short': 0.1}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        
+        # 农产品
+        'RM': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'OI': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'CF': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        'SR': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'PF': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        'C': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'CS': {'has_night_session': True, 'margin_rate': {'long': 0.11, 'short': 0.11}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'CY': {'has_night_session': True, 'margin_rate': {'long': 0.11, 'short': 0.11}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        'A': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'B': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'M': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'Y': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'P': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        
+        # 股指、国债及其他品种(是否有夜盘以 has_night_session 字段为准)
+        'IF': {'has_night_session': False, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 300, 'trading_start_time': '09:30'},
+        'IH': {'has_night_session': False, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 300, 'trading_start_time': '09:30'},
+        'IC': {'has_night_session': False, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 200, 'trading_start_time': '09:30'},
+        'IM': {'has_night_session': False, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 200, 'trading_start_time': '09:30'},
+        'AP': {'has_night_session': False, 'margin_rate': {'long': 0.16, 'short': 0.16}, 'multiplier': 10, 'trading_start_time': '09:00'},
+        'CJ': {'has_night_session': False, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 5, 'trading_start_time': '09:00'},
+        'PK': {'has_night_session': False, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 5, 'trading_start_time': '09:00'},
+        'JD': {'has_night_session': False, 'margin_rate': {'long': 0.11, 'short': 0.11}, 'multiplier': 10, 'trading_start_time': '09:00'},
+        'LH': {'has_night_session': False, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 16, 'trading_start_time': '09:00'},
+        'T': {'has_night_session': False, 'margin_rate': {'long': 0.03, 'short': 0.03}, 'multiplier': 1000000, 'trading_start_time': '09:30'},
+        'PS': {'has_night_session': False, 'margin_rate': {'long': 0.16, 'short': 0.16}, 'multiplier': 3, 'trading_start_time': '09:00'},
+        'UR': {'has_night_session': False, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 20, 'trading_start_time': '09:00'},
+        'MO': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 100, 'trading_start_time': '21:00'},
+        # 'LF': {'has_night_session': False, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 1, 'trading_start_time': '09:30'},
+        'HO': {'has_night_session': False, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 100, 'trading_start_time': '09:30'},
+        # 'LR': {'has_night_session': True, 'margin_rate': {'long': 0.21, 'short': 0.21}, 'multiplier': 20, 'trading_start_time': '21:00'},
+        'LG': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 90, 'trading_start_time': '21:00'},
+        # 'FB': {'has_night_session': True, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        # 'PM': {'has_night_session': True, 'margin_rate': {'long': 0.2, 'short': 0.2}, 'multiplier': 50, 'trading_start_time': '21:00'},
+        'EC': {'has_night_session': False, 'margin_rate': {'long': 0.23, 'short': 0.23}, 'multiplier': 50, 'trading_start_time': '09:00'},
+        # 'RR': {'has_night_session': True, 'margin_rate': {'long': 0.11, 'short': 0.11}, 'multiplier': 10, 'trading_start_time': '21:00'},
+        'OP': {'has_night_session': False, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 40, 'trading_start_time': '09:00'},
+        # 'IO': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 1, 'trading_start_time': '21:00'},
+        'BC': {'has_night_session': True, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 5, 'trading_start_time': '21:00'},
+        # 'WH': {'has_night_session': False, 'margin_rate': {'long': 0.2, 'short': 0.2}, 'multiplier': 20, 'trading_start_time': '09:00'},
+        'SH': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 30, 'trading_start_time': '21:00'},
+        # 'RI': {'has_night_session': False, 'margin_rate': {'long': 0.21, 'short': 0.21}, 'multiplier': 20, 'trading_start_time': '09:00'},
+        'TS': {'has_night_session': False, 'margin_rate': {'long': 0.015, 'short': 0.015}, 'multiplier': 2000000, 'trading_start_time': '09:30'},
+        # 'JR': {'has_night_session': False, 'margin_rate': {'long': 0.21, 'short': 0.21}, 'multiplier': 20, 'trading_start_time': '09:00'},
+        'AD': {'has_night_session': False, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '09:00'},
+        # 'BB': {'has_night_session': False, 'margin_rate': {'long': 0.19, 'short': 0.19}, 'multiplier': 500, 'trading_start_time': '09:00'},
+        'PL': {'has_night_session': False, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 20, 'trading_start_time': '09:00'},
+        # 'RS': {'has_night_session': False, 'margin_rate': {'long': 0.26, 'short': 0.26}, 'multiplier': 10, 'trading_start_time': '09:00'},
+        'SI': {'has_night_session': False, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 5, 'trading_start_time': '09:00'},
+        # 'ZC': {'has_night_session': True, 'margin_rate': {'long': 0.56, 'short': 0.56}, 'multiplier': 100, 'trading_start_time': '21:00'},
+        'SM': {'has_night_session': False, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 5, 'trading_start_time': '09:00'},
+        'AO': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 20, 'trading_start_time': '21:00'},
+        'TL': {'has_night_session': False, 'margin_rate': {'long': 0.045, 'short': 0.045}, 'multiplier': 1000000, 'trading_start_time': '09:00'},
+        'SF': {'has_night_session': False, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 5, 'trading_start_time': '09:00'},
+        # 'WR': {'has_night_session': False, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 10, 'trading_start_time': '09:00'},
+        'PR': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 15, 'trading_start_time': '21:00'},
+        'TF': {'has_night_session': False, 'margin_rate': {'long': 0.022, 'short': 0.022}, 'multiplier': 1000000, 'trading_start_time': '09:00'},
+        # 'VF': {'has_night_session': False, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 1, 'trading_start_time': '09:00'},
+        'BZ': {'has_night_session': False, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 30, 'trading_start_time': '09:00'},
+    }
+    
+    # 策略品种选择策略配置
+    # 方案1:全品种策略 - 考虑所有配置的期货品种
+    g.strategy_focus_symbols = []  # 空列表表示考虑所有品种
+    
+    # 方案2:精选品种策略 - 只交易流动性较好的特定品种(如需使用请取消下行注释)
+    # g.strategy_focus_symbols = ['RM', 'CJ', 'CY', 'JD', 'L', 'LC', 'SF', 'SI']
+    
+    log.info(f"品种选择策略: {'全品种策略(覆盖所有配置品种)' if not g.strategy_focus_symbols else '精选品种策略(' + str(len(g.strategy_focus_symbols)) + '个品种)'}")
+    
+    # 交易记录和数据存储
+    g.trade_history = {}  # 持仓记录 {symbol: {'entry_price': xxx, 'direction': xxx, ...}}
+    g.daily_ma_candidates = {}  # 通过均线和开盘价差检查的候选品种 {symbol: {'direction': 'long'/'short', 'open_price': xxx, ...}}
+    g.today_trades = []  # 当日交易记录
+    g.excluded_contracts = {}  # 每日排除的合约缓存 {dominant_future: {'reason': 'ma_trend'/'open_gap', 'trading_day': xxx}}
+    g.ma_checked_underlyings = {}  # 记录各品种在交易日的均线检查状态 {symbol: trading_day}
+    g.last_ma_trading_day = None  # 最近一次均线检查所属交易日
+    
+    # 夜盘禁止操作标志
+    g.night_session_blocked = False  # 标记是否禁止当晚操作
+    g.night_session_blocked_trading_day = None  # 记录被禁止的交易日
+    
+    # 定时任务设置
+    # 夜盘开始(21:05) - 均线和开盘价差检查
+    run_daily(check_ma_trend_and_open_gap, time='21:05:00', reference_security='IF1808.CCFX')
+    
+    # 日盘开始 - 均线和开盘价差检查
+    run_daily(check_ma_trend_and_open_gap, time='09:05:00', reference_security='IF1808.CCFX')
+    run_daily(check_ma_trend_and_open_gap, time='09:35:00', reference_security='IF1808.CCFX')
+    
+    # 夜盘开仓和止损止盈检查
+    run_daily(check_open_and_stop, time='21:05:00', reference_security='IF1808.CCFX')
+    run_daily(check_open_and_stop, time='21:35:00', reference_security='IF1808.CCFX')
+    run_daily(check_open_and_stop, time='22:05:00', reference_security='IF1808.CCFX')
+    run_daily(check_open_and_stop, time='22:35:00', reference_security='IF1808.CCFX')
+    
+    # 日盘开仓和止损止盈检查
+    run_daily(check_open_and_stop, time='09:05:00', reference_security='IF1808.CCFX')
+    run_daily(check_open_and_stop, time='09:35:00', reference_security='IF1808.CCFX')
+    run_daily(check_open_and_stop, time='10:05:00', reference_security='IF1808.CCFX')
+    run_daily(check_open_and_stop, time='10:35:00', reference_security='IF1808.CCFX')
+    run_daily(check_open_and_stop, time='11:05:00', reference_security='IF1808.CCFX')
+    run_daily(check_open_and_stop, time='11:25:00', reference_security='IF1808.CCFX')
+    run_daily(check_open_and_stop, time='13:35:00', reference_security='IF1808.CCFX')
+    run_daily(check_open_and_stop, time='14:05:00', reference_security='IF1808.CCFX')
+    run_daily(check_open_and_stop, time='14:35:00', reference_security='IF1808.CCFX')
+    run_daily(check_open_and_stop, time='14:55:00', reference_security='IF1808.CCFX')
+    run_daily(check_ma_trailing_reactivation, time='14:55:00', reference_security='IF1808.CCFX')
+    
+    # 收盘后
+    run_daily(after_market_close, time='15:30:00', reference_security='IF1808.CCFX')
+    
+    log.info('=' * 60)
+
+############################ 主程序执行函数 ###################################
+
+def get_current_trading_day(current_dt):
+    """根据当前时间推断对应的期货交易日"""
+    current_date = current_dt.date()
+    current_time = current_dt.time()
+
+    trade_days = get_trade_days(end_date=current_date, count=1)
+    if len(trade_days) > 0 and trade_days[0] == current_date:
+        trading_day = current_date
+    else:
+        next_days = get_trade_days(start_date=current_date, count=1)
+        trading_day = next_days[0] if len(next_days) > 0 else current_date
+
+    if current_time >= time(20, 59):
+        next_trade_days = get_trade_days(start_date=trading_day, count=2)
+        if len(next_trade_days) >= 2:
+            return next_trade_days[1]
+        if len(next_trade_days) == 1:
+            return next_trade_days[0]
+    return trading_day
+
+
+def normalize_trade_day_value(value):
+    """将交易日对象统一转换为 datetime.date"""
+    if isinstance(value, date) and not isinstance(value, datetime):
+        return value
+    if isinstance(value, datetime):
+        return value.date()
+    if hasattr(value, 'to_pydatetime'):
+        return value.to_pydatetime().date()
+    try:
+        return pd.Timestamp(value).date()
+    except Exception:
+        return value
+
+
+def check_ma_trend_and_open_gap(context):
+    """阶段一:开盘时均线走势和开盘价差检查(一天一次)"""
+    log.info("=" * 60)
+    current_trading_day = get_current_trading_day(context.current_dt)
+    log.info(f"执行均线走势和开盘价差检查 - 时间: {context.current_dt}, 交易日: {current_trading_day}")
+    log.info("=" * 60)
+    
+    # 换月移仓检查(在所有部分之前)
+    position_auto_switch(context)
+    
+    # ==================== 第一部分:基础数据获取 ====================
+    
+    # 步骤1:交易日检查和缓存清理
+    if g.last_ma_trading_day != current_trading_day:
+        if g.excluded_contracts:
+            log.info(f"交易日切换至 {current_trading_day},清空上一交易日的排除缓存")
+        g.excluded_contracts = {}
+        g.ma_checked_underlyings = {}
+        g.last_ma_trading_day = current_trading_day
+
+    # 步骤2:获取当前时间和筛选可交易品种
+    current_time = str(context.current_dt.time())[:5]  # HH:MM格式
+    focus_symbols = g.strategy_focus_symbols if g.strategy_focus_symbols else list(g.futures_config.keys())
+    tradable_symbols = []
+    
+    # 根据当前时间确定可交易的时段
+    # 21:05 -> 仅接受21:00开盘的合约
+    # 09:05 -> 接受09:00或21:00开盘的合约
+    # 09:35 -> 接受所有时段(21:00, 09:00, 09:30)的合约
+    for symbol in focus_symbols:
+        trading_start_time = get_futures_config(symbol, 'trading_start_time', '09:05')
+        should_trade = False
+        
+        if current_time == '21:05':
+            should_trade = trading_start_time.startswith('21:00')
+        elif current_time == '09:05':
+            should_trade = trading_start_time.startswith('21:00') or trading_start_time.startswith('09:00')
+        elif current_time == '09:35':
+            should_trade = True
+        
+        if should_trade:
+            tradable_symbols.append(symbol)
+    
+    if not tradable_symbols:
+        log.info(f"当前时间 {current_time} 无品种开盘,跳过检查")
+        return
+    
+    log.info(f"当前时间 {current_time} 开盘品种: {tradable_symbols}")
+    
+    # 步骤3:对每个品种循环处理
+    for symbol in tradable_symbols:
+        # 步骤3.1:检查是否已处理过
+        if g.ma_checked_underlyings.get(symbol) == current_trading_day:
+            log.info(f"{symbol} 已在交易日 {current_trading_day} 完成均线检查,跳过本次执行")
+            continue
+
+        try:
+            g.ma_checked_underlyings[symbol] = current_trading_day
+            
+            # 步骤3.2:获取主力合约
+            dominant_future = get_dominant_future(symbol)
+            if not dominant_future:
+                log.info(f"{symbol} 未找到主力合约,跳过")
+                continue
+            
+            # 步骤3.3:检查排除缓存
+            if dominant_future in g.excluded_contracts:
+                excluded_info = g.excluded_contracts[dominant_future]
+                if excluded_info['trading_day'] == current_trading_day:
+                    continue
+                else:
+                    # 新的一天,从缓存中移除(会在after_market_close统一清理,这里也做兜底)
+                    del g.excluded_contracts[dominant_future]
+            
+            # 步骤3.4:检查是否已有持仓
+            if check_symbol_prefix_match(dominant_future, context, set(g.trade_history.keys())):
+                log.info(f"{symbol} 已有持仓,跳过")
+                continue
+            
+            # 步骤3.5:获取历史数据和前一交易日数据(合并优化)
+            # 获取历史数据(需要足够计算MA30)
+            historical_data = get_price(dominant_future, end_date=context.current_dt, 
+                                       frequency='1d', fields=['open', 'close', 'high', 'low'], 
+                                       count=g.ma_historical_days)
+            
+            if historical_data is None or len(historical_data) < max(g.ma_periods):
+                log.info(f"{symbol} 历史数据不足,跳过")
+                continue
+
+            # 获取前一交易日并在历史数据中匹配
+            previous_trade_days = get_trade_days(end_date=current_trading_day, count=2)
+            previous_trade_days = [normalize_trade_day_value(d) for d in previous_trade_days]
+            previous_trading_day = None
+            if len(previous_trade_days) >= 2:
+                previous_trading_day = previous_trade_days[-2]
+            elif len(previous_trade_days) == 1 and previous_trade_days[0] < current_trading_day:
+                previous_trading_day = previous_trade_days[0]
+
+            if previous_trading_day is None:
+                log.info(f"{symbol} 无法确定前一交易日,跳过")
+                continue
+
+            # 在历史数据中匹配前一交易日
+            historical_dates = historical_data.index.date
+            match_indices = np.where(historical_dates == previous_trading_day)[0]
+            if len(match_indices) == 0:
+                earlier_indices = np.where(historical_dates < previous_trading_day)[0]
+                if len(earlier_indices) == 0:
+                    log.info(f"{symbol} 历史数据缺少 {previous_trading_day} 之前的记录,跳过")
+                    continue
+                match_indices = [earlier_indices[-1]]
+
+            # 提取截至前一交易日的数据,并一次性提取所有需要的字段
+            data_upto_yesterday = historical_data.iloc[:match_indices[-1] + 1]
+            yesterday_data = data_upto_yesterday.iloc[-1]
+            yesterday_close = yesterday_data['close']
+            yesterday_open = yesterday_data['open']
+            
+            # 步骤3.6:获取当前价格数据
+            current_data = get_current_data()[dominant_future]
+            today_open = current_data.day_open
+            
+            # ==================== 第二部分:核心指标计算 ====================
+            
+            # 步骤4:计算均线相关指标(合并优化)
+            ma_values = calculate_ma_values(data_upto_yesterday, g.ma_periods)
+            ma_proximity_counts = calculate_ma_proximity_counts(data_upto_yesterday, g.ma_periods, g.ma_pattern_lookback_days)
+            
+            log.info(f"{symbol}({dominant_future}) 均线检查:")
+            log.info(f"  均线贴近统计: {ma_proximity_counts}")
+            
+            # 检查均线贴近计数
+            proximity_sum = ma_proximity_counts.get('MA5', 0) + ma_proximity_counts.get('MA10', 0)
+            if proximity_sum < g.ma_proximity_min_threshold:
+                log.info(f"  {symbol}({dominant_future}) ✗ 均线贴近计数不足,MA5+MA10={proximity_sum} < {g.ma_proximity_min_threshold},跳过")
+                add_to_excluded_contracts(dominant_future, 'ma_proximity', current_trading_day)
+                continue
+            
+            # 步骤5:计算极端趋势天数
+            extreme_above_count, extreme_below_count = calculate_extreme_trend_days(
+                data_upto_yesterday,
+                g.ma_periods,
+                g.ma_pattern_lookback_days
+            )
+            extreme_total = extreme_above_count + extreme_below_count
+            min_extreme = min(extreme_above_count, extreme_below_count)
+            filter_threshold = max(2, g.ma_pattern_extreme_days_threshold)
+            log.info(
+                f"  极端趋势天数统计: 收盘在所有均线上方 {extreme_above_count} 天, 收盘在所有均线下方 {extreme_below_count} 天, "
+                f"合计 {extreme_total} 天, min(A,B)={min_extreme} (过滤阈值: {filter_threshold})"
+            )
+            if extreme_above_count > 0 and extreme_below_count > 0 and min_extreme >= filter_threshold:
+                log.info(
+                    f"  {symbol}({dominant_future}) ✗ 极端趋势多空同时出现且 min(A,B)={min_extreme} ≥ {filter_threshold},跳过"
+                )
+                add_to_excluded_contracts(dominant_future, 'ma_extreme_trend', current_trading_day)
+                continue
+
+            # 步骤6:判断均线走势
+            direction = None
+            if check_ma_pattern(ma_values, 'long'):
+                direction = 'long'
+            elif check_ma_pattern(ma_values, 'short'):
+                direction = 'short'
+            else:
+                add_to_excluded_contracts(dominant_future, 'ma_trend', current_trading_day)
+                continue
+            
+            # 步骤7:检查MA5分布过滤
+            if g.enable_ma_distribution_filter:
+                distribution_passed, distribution_stats = check_ma5_distribution_filter(
+                    data_upto_yesterday,
+                    g.ma_distribution_lookback_days,
+                    direction,
+                    g.ma_distribution_min_ratio
+                )
+                log.info(
+                    f"  MA5分布过滤: 方向 {direction}, 有效天数 "
+                    f"{distribution_stats['valid_days']}/{distribution_stats['lookback_days']},"
+                    f"满足天数 {distribution_stats['qualified_days']}/{distribution_stats['required_days']}"
+                )
+                if not distribution_passed:
+                    insufficiency = distribution_stats['valid_days'] < distribution_stats['lookback_days']
+                    reason = "有效数据不足" if insufficiency else "满足天数不足"
+                    log.info(
+                        f"  {symbol}({dominant_future}) ✗ MA5分布过滤未通过({reason})"
+                    )
+                    add_to_excluded_contracts(dominant_future, 'ma5_distribution', current_trading_day)
+                    continue
+            
+            # 步骤8:检查历史均线模式一致性
+            consistency_passed, consistency_ratio = check_historical_ma_pattern_consistency(
+                historical_data, direction, g.ma_pattern_lookback_days, g.ma_pattern_consistency_threshold
+            )
+            
+            if not consistency_passed:
+                log.info(f"  {symbol}({dominant_future}) ✗ 历史均线模式一致性不足 "
+                        f"({consistency_ratio:.1%} < {g.ma_pattern_consistency_threshold:.1%}),跳过")
+                add_to_excluded_contracts(dominant_future, 'ma_consistency', current_trading_day)
+                continue
+            else:
+                log.info(f"  {symbol}({dominant_future}) ✓ 历史均线模式一致性检查通过 "
+                        f"({consistency_ratio:.1%} >= {g.ma_pattern_consistency_threshold:.1%})")
+            
+            # 步骤9:检查开盘价差(可配置开关)
+            if g.enable_open_gap_filter:
+                gap_check_passed = check_open_gap_filter(
+                    symbol=symbol,
+                    dominant_future=dominant_future,
+                    direction=direction,
+                    yesterday_close=yesterday_close,
+                    today_open=today_open,
+                    current_trading_day=current_trading_day
+                )
+                if not gap_check_passed:
+                    continue
+            else:
+                log.info("  已关闭开盘价差过滤(enable_open_gap_filter=False),跳过该检查")
+            
+            # 步骤10:将通过检查的品种加入候选列表
+            g.daily_ma_candidates[dominant_future] = {
+                'symbol': symbol,
+                'direction': direction,
+                'open_price': today_open,
+                'yesterday_close': yesterday_close,
+                'yesterday_open': yesterday_open,
+                'ma_values': ma_values
+            }
+            
+            log.info(f"  ✓✓ {symbol} 通过均线和开盘价差检查,加入候选列表")
+            
+        except Exception as e:
+            g.ma_checked_underlyings.pop(symbol, None)
+            log.warning(f"{symbol} 检查时出错: {str(e)}")
+            continue
+    
+    log.info(f"候选列表更新完成,当前候选品种: {list(g.daily_ma_candidates.keys())}")
+    log.info("=" * 60)
+
+def check_open_and_stop(context):
+    """统一的开仓和止损止盈检查函数"""
+    # 先检查换月移仓
+    log.info("=" * 60)
+    current_trading_day = get_current_trading_day(context.current_dt)
+    log.info(f"执行开仓和止损止盈检查 - 时间: {context.current_dt}, 交易日: {current_trading_day}")
+    log.info("=" * 60)
+    log.info(f"先检查换月:")
+    position_auto_switch(context)
+    
+    # 获取当前时间
+    current_time = str(context.current_dt.time())[:2]
+    
+    # 判断是否为夜盘时间
+    is_night_session = (current_time in ['21', '22', '23', '00', '01', '02'])
+    
+    # 检查是否禁止当晚操作
+    if is_night_session and g.night_session_blocked:
+        blocked_trading_day = normalize_trade_day_value(g.night_session_blocked_trading_day) if g.night_session_blocked_trading_day else None
+        current_trading_day_normalized = normalize_trade_day_value(current_trading_day)
+        if blocked_trading_day == current_trading_day_normalized:
+            log.info(f"当晚操作已被禁止(订单状态为'new',无夜盘),跳过所有操作")
+            return
+
+    # 得到当前未完成订单
+    orders = get_open_orders()
+    # 循环,撤销订单
+    if len(orders) == 0:
+        log.debug(f"无未完成订单")
+    else:
+        for _order in orders.values():
+            log.debug(f"order: {_order}")
+            cancel_order(_order)
+    
+    # 第一步:检查开仓条件
+    log.info(f"检查开仓条件:")
+    if g.daily_ma_candidates:
+        log.info("=" * 60)
+        log.info(f"执行开仓检查 - 时间: {context.current_dt}, 候选品种数量: {len(g.daily_ma_candidates)}")
+        
+        # 遍历候选品种
+        candidates_to_remove = []
+        
+        for dominant_future, candidate_info in g.daily_ma_candidates.items():
+            try:
+                symbol = candidate_info['symbol']
+                direction = candidate_info['direction']
+                open_price = candidate_info['open_price']
+                yesterday_close = candidate_info.get('yesterday_close')
+                yesterday_open = candidate_info.get('yesterday_open')
+                
+                # 检查是否已有持仓
+                if check_symbol_prefix_match(dominant_future, context, set(g.trade_history.keys())):
+                    log.info(f"{symbol} 已有持仓,从候选列表移除")
+                    candidates_to_remove.append(dominant_future)
+                    continue
+                
+                # 获取当前价格
+                current_data = get_current_data()[dominant_future]
+                current_price = current_data.last_price
+                
+                # 计算当天价差
+                intraday_diff = current_price - open_price
+                intraday_diff_ratio = intraday_diff / open_price
+                
+                log.info(f"{symbol}({dominant_future}) 开仓条件检查:")
+                log.info(f"  方向: {direction}, 开盘价: {open_price:.2f}, 当前价: {current_price:.2f}, "
+                        f"当天价差: {intraday_diff:.2f}, 变化比例: {intraday_diff_ratio:.2%}")
+                
+                # 判断是否满足开仓条件 - 仅检查均线穿越得分
+                should_open = True
+                log.info(f"  开仓条件简化:仅检查均线穿越得分,跳过策略1,2,3的判断")
+                
+                if should_open:
+                    ma_values = candidate_info.get('ma_values') or {}
+                    cross_score, score_details = calculate_ma_cross_score(open_price, current_price, ma_values, direction)
+
+                    # 根据当前时间调整所需的均线穿越得分阈值
+                    current_time_str = str(context.current_dt.time())[:5]  # HH:MM格式
+                    required_cross_score = g.ma_cross_threshold
+                    if current_time_str != '14:55':
+                        # 在14:55以外的时间,需要更高的得分阈值
+                        required_cross_score = g.ma_cross_threshold + 1
+
+                    log.info(f"  均线穿越得分: {cross_score}, 得分详情: {score_details}, 当前时间: {current_time_str}, 所需阈值: {required_cross_score}")
+
+                    # 检查得分是否满足条件
+                    score_passed = False
+                    if cross_score >= required_cross_score:
+                        # 如果得分达到阈值,检查特殊情况:只有1分且来自MA5
+                        if cross_score == 1 and required_cross_score == 1:
+                            # 检查这一分是否来自MA5
+                            from_ma5_only = len(score_details) == 1 and score_details[0]['period'] == 5
+                            if not from_ma5_only:
+                                score_passed = True
+                                log.info(f"  ✓ 均线穿越得分检查通过:1分且非来自MA5")
+                            else:
+                                log.info(f"  ✗ 均线穿越得分检查未通过:1分且仅来自MA5")
+                        else:
+                            score_passed = True
+                            log.info(f"  ✓ 均线穿越得分检查通过")
+
+                    if not score_passed:
+                        log.info(f"  ✗ 均线穿越得分不足或不符合条件({cross_score} < {required_cross_score} 或 1分来自MA5),跳过开仓")
+                        continue
+                    # 执行开仓
+                    log.info(f"  准备开仓: {symbol} {direction}")
+                    target_hands, single_hand_margin = calculate_target_hands(context, dominant_future, direction)
+                    
+                    if target_hands > 0:
+                        success = open_position(context, dominant_future, target_hands, direction, single_hand_margin,
+                                              f'均线形态开仓')
+                        if success:
+                            log.info(f"  ✓✓ {symbol} 开仓成功,从候选列表移除")
+                            candidates_to_remove.append(dominant_future)
+                        else:
+                            log.warning(f"  ✗ {symbol} 开仓失败")
+                    else:
+                        log.warning(f"  ✗ {symbol} 计算目标手数为0,跳过开仓")
+                        
+            except Exception as e:
+                log.warning(f"{dominant_future} 处理时出错: {str(e)}")
+                continue
+        
+        # 从候选列表中移除已开仓的品种
+        for future in candidates_to_remove:
+            if future in g.daily_ma_candidates:
+                del g.daily_ma_candidates[future]
+        
+        log.info(f"剩余候选品种: {list(g.daily_ma_candidates.keys())}")
+        log.info("=" * 60)
+    
+    # 第二步:检查止损止盈
+    log.info(f"检查止损止盈条件:")
+    subportfolio = context.subportfolios[0]
+    long_positions = list(subportfolio.long_positions.values())
+    short_positions = list(subportfolio.short_positions.values())
+    
+    closed_count = 0
+    skipped_count = 0
+    
+    for position in long_positions + short_positions:
+        security = position.security
+        underlying_symbol = security.split('.')[0][:-4]
+        
+        # 检查交易时间适配性
+        has_night_session = get_futures_config(underlying_symbol, 'has_night_session', False)
+        
+        # 如果是夜盘时间,但品种不支持夜盘交易,则跳过
+        if is_night_session and not has_night_session:
+            skipped_count += 1
+            continue
+        
+        # 执行止损止盈检查
+        if check_position_stop_loss_profit(context, position):
+            closed_count += 1
+    
+    if closed_count > 0:
+        log.info(f"执行了 {closed_count} 次止损止盈")
+    
+    if skipped_count > 0:
+        log.info(f"夜盘时间跳过 {skipped_count} 个日间品种的止损止盈检查")
+
+
+def check_ma_trailing_reactivation(context):
+    """检查是否需要恢复均线跟踪止盈"""
+    subportfolio = context.subportfolios[0]
+    positions = list(subportfolio.long_positions.values()) + list(subportfolio.short_positions.values())
+    
+    if not positions:
+        return
+    
+    reenabled_count = 0
+    current_data = get_current_data()
+    
+    for position in positions:
+        security = position.security
+        trade_info = g.trade_history.get(security)
+        
+        if not trade_info or trade_info.get('ma_trailing_enabled', True):
+            continue
+        
+        direction = trade_info['direction']
+        ma_values = calculate_realtime_ma_values(security, [5])
+        ma5_value = ma_values.get('ma5')
+        
+        if ma5_value is None or security not in current_data:
+            continue
+        
+        today_price = current_data[security].last_price
+        
+        if direction == 'long' and today_price > ma5_value:
+            trade_info['ma_trailing_enabled'] = True
+            reenabled_count += 1
+            log.info(f"恢复均线跟踪止盈: {security} {direction}, 当前价 {today_price:.2f} > MA5 {ma5_value:.2f}")
+        elif direction == 'short' and today_price < ma5_value:
+            trade_info['ma_trailing_enabled'] = True
+            reenabled_count += 1
+            log.info(f"恢复均线跟踪止盈: {security} {direction}, 当前价 {today_price:.2f} < MA5 {ma5_value:.2f}")
+    
+    if reenabled_count > 0:
+        log.info(f"恢复均线跟踪止盈持仓数量: {reenabled_count}")
+
+
+def check_position_stop_loss_profit(context, position):
+    """检查单个持仓的止损止盈"""
+    log.info(f"检查持仓: {position.security}")
+    security = position.security
+    
+    if security not in g.trade_history:
+        return False
+    
+    trade_info = g.trade_history[security]
+    direction = trade_info['direction']
+    entry_price = trade_info['entry_price']
+    entry_time = trade_info['entry_time']
+    entry_trading_day = trade_info.get('entry_trading_day')
+    if entry_trading_day is None:
+        entry_trading_day = get_current_trading_day(entry_time)
+        trade_info['entry_trading_day'] = entry_trading_day
+    if entry_trading_day is not None:
+        entry_trading_day = normalize_trade_day_value(entry_trading_day)
+    current_trading_day = normalize_trade_day_value(get_current_trading_day(context.current_dt))
+    current_price = position.price
+    
+    # 计算当前盈亏比率
+    if direction == 'long':
+        profit_rate = (current_price - entry_price) / entry_price
+    else:
+        profit_rate = (entry_price - current_price) / entry_price
+    
+    # 检查固定止损
+    log.info("=" * 60)
+    log.info(f"检查固定止损:")
+    log.info("=" * 60)
+    if profit_rate <= -g.fixed_stop_loss_rate:
+        log.info(f"{security} {direction} 触发固定止损 {g.fixed_stop_loss_rate:.3%}, 当前亏损率: {profit_rate:.3%}, "
+                f"成本价: {entry_price:.2f}, 当前价格: {current_price:.2f}")
+        close_position(context, security, direction)
+        return True
+    else:
+        log.debug(f"{security} {direction} 未触发固定止损 {g.fixed_stop_loss_rate:.3%}, 当前亏损率: {profit_rate:.3%}, "
+                f"成本价: {entry_price:.2f}, 当前价格: {current_price:.2f}")
+
+    if entry_trading_day is not None and entry_trading_day == current_trading_day:
+        log.info(f"{security} 建仓交易日内跳过动态止盈检查")
+        return False
+
+    # 检查是否启用均线跟踪止盈
+    log.info("=" * 60)
+    log.info(f"检查是否启用均线跟踪止盈:")
+    log.info("=" * 60)
+    if not trade_info.get('ma_trailing_enabled', True):
+        log.debug(f"{security} {direction} 未启用均线跟踪止盈")
+        return False
+
+    # 检查均线跟踪止盈
+    # 获取持仓天数
+    entry_date = entry_time.date()
+    current_date = context.current_dt.date()
+    all_trade_days = get_all_trade_days()
+    holding_days = sum((entry_date <= d <= current_date) for d in all_trade_days)
+    
+    # 计算变化率
+    today_price = get_current_data()[security].last_price
+    avg_daily_change_rate = calculate_average_daily_change_rate(security)
+    historical_data = attribute_history(security, 1, '1d', ['close'])
+    yesterday_close = historical_data['close'].iloc[-1]
+    today_change_rate = abs((today_price - yesterday_close) / yesterday_close)
+    
+    # 根据时间判断使用的偏移量
+    current_time = context.current_dt.time()
+    target_time = datetime.strptime('14:55:00', '%H:%M:%S').time()
+    if current_time > target_time:
+        offset_ratio = g.ma_offset_ratio_close
+        log.debug(f"当前时间是:{current_time},使用偏移量: {offset_ratio:.3%}")
+    else:
+        offset_ratio = g.ma_offset_ratio_normal
+        log.debug(f"当前时间是:{current_time},使用偏移量: {offset_ratio:.3%}")
+    # 选择止损均线
+    close_line = None
+    if today_change_rate >= 1.5 * avg_daily_change_rate:
+        close_line = 'ma5'  # 波动剧烈时用短周期
+    elif holding_days <= g.days_for_adjustment:
+        close_line = 'ma5'  # 持仓初期用短周期
+    else:
+        close_line = 'ma5' if today_change_rate >= 1.2 * avg_daily_change_rate else 'ma10'
+    
+    # 计算实时均线值
+    ma_values = calculate_realtime_ma_values(security, [5, 10])
+    ma_value = ma_values[close_line]
+    
+    # 应用偏移量
+    if direction == 'long':
+        adjusted_ma_value = ma_value * (1 - offset_ratio)
+    else:
+        adjusted_ma_value = ma_value * (1 + offset_ratio)
+    
+    # 判断是否触发均线止损
+    if (direction == 'long' and today_price < adjusted_ma_value) or \
+       (direction == 'short' and today_price > adjusted_ma_value):
+        log.info(f"触发均线跟踪止盈 {security} {direction}, 止损均线: {close_line}, "
+                f"均线值: {ma_value:.2f}, 调整后: {adjusted_ma_value:.2f}, "
+                f"当前价: {today_price:.2f}, 持仓天数: {holding_days}")
+        close_position(context, security, direction)
+        return True
+    else:
+        log.debug(f"未触发均线跟踪止盈 {security} {direction}, 止损均线: {close_line}, "
+                f"均线值: {ma_value:.2f}, 调整后: {adjusted_ma_value:.2f}, "
+                f"当前价: {today_price:.2f}, 持仓天数: {holding_days}")
+    
+    return False
+
+############################ 核心辅助函数 ###################################
+
+def calculate_ma_values(data, periods):
+    """计算均线值
+    
+    Args:
+        data: DataFrame,包含'close'列的历史数据(最后一行是最新的数据)
+        periods: list,均线周期列表,如[5, 10, 20, 30]
+    
+    Returns:
+        dict: {'MA5': value, 'MA10': value, 'MA20': value, 'MA30': value}
+        返回最后一行(最新日期)的各周期均线值
+    """
+    ma_values = {}
+    
+    for period in periods:
+        if len(data) >= period:
+            # 计算最后period天的均线值
+            ma_values[f'MA{period}'] = data['close'].iloc[-period:].mean()
+        else:
+            ma_values[f'MA{period}'] = None
+    
+    return ma_values
+
+
+def calculate_ma_cross_score(open_price, current_price, ma_values, direction):
+    """根据开盘价与当前价统计多周期均线穿越得分
+    返回: (总得分, 得分详情列表)
+    得分详情: [{'period': 5, 'delta': 1}, {'period': 10, 'delta': 1}, ...]
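+    计分示例: 多头方向下,若价格自开盘价起上穿 MA5 与 MA10 两条均线,总得分为 +2;
+    每反向穿越一条均线记 -1;空头方向计分正好相反。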
+    """
+    if not ma_values:
+        return 0, []
+    assert direction in ('long', 'short')
+    score = 0
+    score_details = []
+    for period in g.ma_periods:
+        key = f'MA{period}'
+        ma_value = ma_values.get(key)
+        if ma_value is None:
+            continue
+        cross_up = open_price < ma_value and current_price > ma_value
+        cross_down = open_price > ma_value and current_price < ma_value
+        if not (cross_up or cross_down):
+            continue
+        if direction == 'long':
+            delta = 1 if cross_up else -1
+        else:
+            delta = -1 if cross_up else 1
+        score += delta
+        score_details.append({'period': period, 'delta': delta})
+        log.debug(
+            f"  均线穿越[{key}] - 开盘 {open_price:.2f}, 当前 {current_price:.2f}, "
+            f"均线 {ma_value:.2f}, 方向 {direction}, 增量 {delta}, 当前得分 {score}"
+        )
+    return score, score_details
+
+
+def calculate_ma_proximity_counts(data, periods, lookback_days):
+    """统计近 lookback_days 天收盘价贴近各均线的次数"""
+    proximity_counts = {f'MA{period}': 0 for period in periods}
+
+    if len(data) < lookback_days:
+        return proximity_counts
+
+    closes = data['close'].iloc[-lookback_days:]
+    ma_series = {
+        period: data['close'].rolling(window=period).mean().iloc[-lookback_days:]
+        for period in periods
+    }
+
+    for idx, close_price in enumerate(closes):
+        min_diff = None
+        closest_period = None
+
+        for period in periods:
+            ma_value = ma_series[period].iloc[idx]
+            if pd.isna(ma_value):
+                continue
+            diff = abs(close_price - ma_value)
+            if min_diff is None or diff < min_diff:
+                min_diff = diff
+                closest_period = period
+
+        if closest_period is not None:
+            proximity_counts[f'MA{closest_period}'] += 1
+
+    return proximity_counts
+
+
+def calculate_extreme_trend_days(data, periods, lookback_days):
+    """统计过去 lookback_days 天收盘价相对所有均线的极端趋势天数"""
+    if len(data) < lookback_days:
+        return 0, 0
+
+    recent_closes = data['close'].iloc[-lookback_days:]
+    ma_series = {
+        period: data['close'].rolling(window=period).mean().iloc[-lookback_days:]
+        for period in periods
+    }
+
+    above_count = 0
+    below_count = 0
+
+    for idx, close_price in enumerate(recent_closes):
+        ma_values = []
+        valid = True
+
+        for period in periods:
+            ma_value = ma_series[period].iloc[idx]
+            if pd.isna(ma_value):
+                valid = False
+                break
+            ma_values.append(ma_value)
+
+        if not valid or not ma_values:
+            continue
+
+        if all(close_price > ma_value for ma_value in ma_values):
+            above_count += 1
+        elif all(close_price < ma_value for ma_value in ma_values):
+            below_count += 1
+
+    return above_count, below_count
+
+
+def check_ma5_distribution_filter(data, lookback_days, direction, min_ratio):
+    """检查近 lookback_days 天收盘价相对于MA5的分布情况"""
+    stats = {
+        'lookback_days': lookback_days,
+        'valid_days': 0,
+        'qualified_days': 0,
+        'required_days': max(0, math.ceil(lookback_days * min_ratio))
+    }
+
+    if lookback_days <= 0:
+        return True, stats
+
+    if len(data) < max(lookback_days, 5):
+        return False, stats
+
+    recent_closes = data['close'].iloc[-lookback_days:]
+    ma5_series = data['close'].rolling(window=5).mean().iloc[-lookback_days:]
+
+    for close_price, ma5_value in zip(recent_closes, ma5_series):
+        log.debug(f"close_price: {close_price}, ma5_value: {ma5_value}")
+        if pd.isna(ma5_value):
+            continue
+        stats['valid_days'] += 1
+        if direction == 'long' and close_price < ma5_value:
+            stats['qualified_days'] += 1
+        elif direction == 'short' and close_price > ma5_value:
+            stats['qualified_days'] += 1
+
+    if stats['valid_days'] < lookback_days:
+        return False, stats
+
+    return stats['qualified_days'] >= stats['required_days'], stats
+
+
+def check_open_gap_filter(symbol, dominant_future, direction, yesterday_close, today_open, current_trading_day):
+    """开盘价差过滤辅助函数
+
+    根据当前策略模式(`g.ma_gap_strategy_mode`)和对应阈值检查开盘价差是否符合方向要求。
+
+    Args:
+        symbol: 品种代码(如 'AO')
+        dominant_future: 主力合约代码(如 'AO2502.XSGE')
+        direction: 方向,'long' 或 'short'
+        yesterday_close: 前一交易日收盘价
+        today_open: 当日开盘价
+        current_trading_day: 当前期货交易日
+
+    Returns:
+        bool: True 表示通过开盘价差过滤,False 表示未通过(并已加入排除缓存)
+    """
+    open_gap_ratio = (today_open - yesterday_close) / yesterday_close
+    log.info(
+        f"  开盘价差检查: 昨收 {yesterday_close:.2f}, 今开 {today_open:.2f}, "
+        f"价差比例 {open_gap_ratio:.2%}"
+    )
+
+    gap_check_passed = False
+
+    if g.ma_gap_strategy_mode == 1:
+        # 方案1:多头检查上跳,空头检查下跳
+        if direction == 'long' and open_gap_ratio >= g.ma_open_gap_threshold:
+            log.info(
+                f"  {symbol}({dominant_future}) ✓ 方案1多头开盘价差检查通过 "
+                f"({open_gap_ratio:.2%} >= {g.ma_open_gap_threshold:.2%})"
+            )
+            gap_check_passed = True
+        elif direction == 'short' and open_gap_ratio <= -g.ma_open_gap_threshold:
+            log.info(
+                f"  {symbol}({dominant_future}) ✓ 方案1空头开盘价差检查通过 "
+                f"({open_gap_ratio:.2%} <= {-g.ma_open_gap_threshold:.2%})"
+            )
+            gap_check_passed = True
+    elif g.ma_gap_strategy_mode == 2 or g.ma_gap_strategy_mode == 3:
+        # 方案2和方案3:多头检查下跳,空头检查上跳
+        if direction == 'long' and open_gap_ratio <= -g.ma_open_gap_threshold2:
+            log.info(
+                f"  {symbol}({dominant_future}) ✓ 方案{g.ma_gap_strategy_mode}多头开盘价差检查通过 "
+                f"({open_gap_ratio:.2%} <= {-g.ma_open_gap_threshold2:.2%})"
+            )
+            gap_check_passed = True
+        elif direction == 'short' and open_gap_ratio >= g.ma_open_gap_threshold2:
+            log.info(
+                f"  {symbol}({dominant_future}) ✓ 方案{g.ma_gap_strategy_mode}空头开盘价差检查通过 "
+                f"({open_gap_ratio:.2%} >= {g.ma_open_gap_threshold2:.2%})"
+            )
+            gap_check_passed = True
+
+    if not gap_check_passed:
+        add_to_excluded_contracts(dominant_future, 'open_gap', current_trading_day)
+        return False
+
+    return True
+
+
+def check_ma_pattern(ma_values, direction):
+    """检查均线排列模式是否符合方向要求
+    
+    Args:
+        ma_values: dict,包含MA5, MA10, MA20, MA30的均线值
+        direction: str,'long'或'short'
+    
+    Returns:
+        bool: 是否符合均线排列要求
+    """
+    ma5 = ma_values['MA5']
+    ma10 = ma_values['MA10']
+    ma20 = ma_values['MA20']
+    ma30 = ma_values['MA30']
+    
+    if direction == 'long':
+        # 多头模式:MA30 <= MA20 <= MA10 <= MA5 或 MA30 <= MA20 <= MA5 <= MA10
+        # 或者:MA20 <= MA30 <= MA10 <= MA5 或 MA20 <= MA30 <= MA5 <= MA10
+        pattern1 = (ma30 <= ma20 <= ma10 <= ma5)
+        pattern2 = (ma30 <= ma20 <= ma5 <= ma10)
+        pattern3 = (ma20 <= ma30 <= ma10 <= ma5)
+        pattern4 = (ma20 <= ma30 <= ma5 <= ma10)
+        return pattern1 or pattern2 or pattern3 or pattern4
+    elif direction == 'short':
+        # 空头模式:MA10 <= MA5 <= MA20 <= MA30 或 MA5 <= MA10 <= MA20 <= MA30
+        # 或者:MA10 <= MA5 <= MA30 <= MA20 或 MA5 <= MA10 <= MA30 <= MA20
+        pattern1 = (ma10 <= ma5 <= ma20 <= ma30)
+        pattern2 = (ma5 <= ma10 <= ma20 <= ma30)
+        pattern3 = (ma10 <= ma5 <= ma30 <= ma20)
+        pattern4 = (ma5 <= ma10 <= ma30 <= ma20)
+        return pattern1 or pattern2 or pattern3 or pattern4
+    else:
+        return False
+
+def check_historical_ma_pattern_consistency(historical_data, direction, lookback_days, consistency_threshold):
+    """检查历史均线模式的一致性
+    
+    Args:
+        historical_data: DataFrame,包含足够天数的历史数据
+        direction: str,'long'或'short'
+        lookback_days: int,检查过去多少天
+        consistency_threshold: float,一致性阈值(0-1之间)
+    
+    Returns:
+        tuple: (bool, float) - (是否通过一致性检查, 实际一致性比例)
+    """
+    if len(historical_data) < max(g.ma_periods) + lookback_days:
+        # 历史数据不足
+        return False, 0.0
+    
+    match_count = 0
+    total_count = lookback_days
+    # log.debug(f"历史均线模式一致性检查: {direction}, 检查过去{lookback_days}天的数据")
+    # log.debug(f"历史数据: {historical_data}")
+    
+    # 检查过去lookback_days天的均线模式
+    for i in range(lookback_days):
+        # 获取倒数第(i+1)天的数据(i=0时是昨天,i=1时是前天,依此类推)
+        end_idx = -(i + 1)
+        # 获取这一天的具体日期
+        date = historical_data.index[end_idx].date()
+        # 获取到该天(包括该天)为止的所有数据
+        if i == 0:
+            data_slice = historical_data
+        else:
+            data_slice = historical_data.iloc[:-i]
+        
+        # 计算该天的均线值
+        # log.debug(f"对于倒数第{i+1}天,end_idx: {end_idx},日期: {date},计算均线值: {data_slice}")
+        ma_values = calculate_ma_values(data_slice, g.ma_periods)
+        # log.debug(f"end_idx: {end_idx},日期: {date},倒数第{i+1}天的均线值: {ma_values}")
+        
+        # 检查是否符合模式
+        if check_ma_pattern(ma_values, direction):
+            match_count += 1
+            # log.debug(f"日期: {date},对于倒数第{i+1}天,历史均线模式一致性检查: {direction} 符合模式")
+        # else:
+            # log.debug(f"日期: {date},对于倒数第{i+1}天,历史均线模式一致性检查: {direction} 不符合模式")
+    
+    consistency_ratio = match_count / total_count
+    passed = consistency_ratio >= consistency_threshold
+    
+    return passed, consistency_ratio
+
+############################ 交易执行函数 ###################################
+
+def open_position(context, security, target_hands, direction, single_hand_margin, reason=''):
+    """开仓"""
+    try:
+        # 记录交易前的可用资金
+        cash_before = context.portfolio.available_cash
+        
+        # 使用order_target按手数开仓
+        order = order_target(security, target_hands, side=direction)
+        log.debug(f"order: {order}")
+        
+        # 检查订单状态,如果为'new'说明当晚没有夜盘
+        if order is not None:
+            order_status = str(order.status).lower()
+            if order_status == 'new':
+                # 取消订单
+                cancel_order(order)
+                current_trading_day = get_current_trading_day(context.current_dt)
+                g.night_session_blocked = True
+                g.night_session_blocked_trading_day = current_trading_day
+                log.warning(f"订单状态为'new',说明{current_trading_day}当晚没有夜盘,已取消订单: {security} {direction} {target_hands}手,并禁止当晚所有操作")
+                return False
+        
+        if order is not None and order.filled > 0:
+            # 记录交易后的可用资金
+            cash_after = context.portfolio.available_cash
+            
+            # 计算实际资金变化
+            cash_change = cash_before - cash_after
+
+            # 计算保证金变化
+            margin_change = single_hand_margin * target_hands
+            
+            # 获取订单价格和数量
+            order_price = order.avg_cost if order.avg_cost else order.price
+            order_amount = order.filled
+            
+            # 记录当日交易
+            underlying_symbol = security.split('.')[0][:-4]
+            g.today_trades.append({
+                'security': security, # 合约代码
+                'underlying_symbol': underlying_symbol, # 标的代码
+                'direction': direction, # 方向
+                'order_amount': order_amount, # 订单数量
+                'order_price': order_price, # 订单价格
+                'cash_change': cash_change, # 资金变化
+                'margin_change': margin_change, # 保证金
+                'time': context.current_dt # 时间
+            })
+            
+            # 记录交易信息
+            entry_trading_day = get_current_trading_day(context.current_dt)
+            g.trade_history[security] = {
+                'entry_price': order_price,
+                'target_hands': target_hands,
+                'actual_hands': order_amount,
+                'actual_margin': margin_change,
+                'direction': direction,
+                'entry_time': context.current_dt,
+                'entry_trading_day': entry_trading_day
+            }
+
+            ma_trailing_enabled = True
+            ma_values_at_entry = calculate_realtime_ma_values(security, [5])
+            ma5_value = ma_values_at_entry.get('ma5')
+            if ma5_value is not None:
+                if direction == 'long' and order_price < ma5_value:
+                    ma_trailing_enabled = False
+                    log.info(f"禁用均线跟踪止盈: {security} {direction}, 开仓价 {order_price:.2f} < MA5 {ma5_value:.2f}")
+                elif direction == 'short' and order_price > ma5_value:
+                    ma_trailing_enabled = False
+                    log.info(f"禁用均线跟踪止盈: {security} {direction}, 开仓价 {order_price:.2f} > MA5 {ma5_value:.2f}")
+
+            g.trade_history[security]['ma_trailing_enabled'] = ma_trailing_enabled
+            
+            log.info(f"开仓成功: {security} {direction} {order_amount}手 @{order_price:.2f}, "
+                    f"保证金: {margin_change:.0f}, 资金变化: {cash_change:.0f}, 原因: {reason}")
+            
+            return True
+            
+    except Exception as e:
+        log.warning(f"开仓失败 {security}: {str(e)}")
+    
+    return False
+
+def close_position(context, security, direction):
+    """平仓"""
+    try:
+        # 使用order_target平仓到0手
+        order = order_target(security, 0, side=direction)
+        
+        if order is not None and order.filled > 0:
+            underlying_symbol = security.split('.')[0][:-4]
+            
+            # 记录当日交易(平仓)
+            g.today_trades.append({
+                'security': security,
+                'underlying_symbol': underlying_symbol,
+                'direction': direction,
+                'order_amount': -order.filled,
+                'order_price': order.avg_cost if order.avg_cost else order.price,
+                'cash_change': 0,
+                'time': context.current_dt
+            })
+            
+            log.info(f"平仓成功: {underlying_symbol} {direction} {order.filled}手")
+            
+            # 从交易历史中移除
+            if security in g.trade_history:
+                del g.trade_history[security]
+                log.debug(f"从交易历史中移除: {security}")
+        else:
+            filled = order.filled if order is not None else 0
+            log.warning(f"平仓失败: {security} {direction}, 实际成交 {filled}手")
+            return False
+            
+    except Exception as e:
+        log.warning(f"平仓失败 {security}: {str(e)}")
+    
+    return False
+
+############################ 辅助函数 ###################################
+
+def get_futures_config(underlying_symbol, config_key=None, default_value=None):
+    """获取期货品种配置信息的辅助函数"""
+    if underlying_symbol not in g.futures_config:
+        if config_key and default_value is not None:
+            return default_value
+        return {}
+    
+    if config_key is None:
+        return g.futures_config[underlying_symbol]
+    
+    return g.futures_config[underlying_symbol].get(config_key, default_value)
+
+def get_margin_rate(underlying_symbol, direction, default_rate=0.10):
+    """获取保证金比例的辅助函数"""
+    return g.futures_config.get(underlying_symbol, {}).get('margin_rate', {}).get(direction, default_rate)
+
+def get_multiplier(underlying_symbol, default_multiplier=10):
+    """获取合约乘数的辅助函数"""
+    return g.futures_config.get(underlying_symbol, {}).get('multiplier', default_multiplier)
+
+def add_to_excluded_contracts(dominant_future, reason, current_trading_day):
+    """将合约添加到排除缓存"""
+    g.excluded_contracts[dominant_future] = {
+        'reason': reason,
+        'trading_day': current_trading_day
+    }
+
+def has_reached_trading_start(current_dt, trading_start_time_str, has_night_session=False):
+    """判断当前是否已到达合约允许交易的起始时间"""
+    if not trading_start_time_str:
+        return True
+
+    try:
+        hour, minute = [int(part) for part in trading_start_time_str.split(':')[:2]]
+    except Exception:
+        return True
+
+    start_time = time(hour, minute)
+    current_time = current_dt.time()
+
+    if has_night_session:
+        if current_time >= start_time:
+            return True
+        if current_time < time(12, 0):
+            return True
+        if time(8, 30) <= current_time <= time(15, 30):
+            return True
+        return False
+
+    if current_time < start_time:
+        return False
+    if current_time >= time(20, 0):
+        return False
+    return True
+
+def calculate_target_hands(context, security, direction):
+    """计算目标开仓手数"""
+    current_price = get_current_data()[security].last_price
+    underlying_symbol = security.split('.')[0][:-4]
+    
+    # 使用保证金比例
+    margin_rate = get_margin_rate(underlying_symbol, direction)
+    multiplier = get_multiplier(underlying_symbol)
+    
+    # 计算单手保证金
+    single_hand_margin = current_price * multiplier * margin_rate
+    log.debug(f"计算单手保证金: {current_price:.2f} * {multiplier:.2f} * {margin_rate:.2f} = {single_hand_margin:.2f}")
+    
+    # 还要考虑可用资金限制
+    available_cash = context.portfolio.available_cash * g.usage_percentage
+    
+    # 根据单个标的最大持仓保证金限制计算开仓数量
+    max_margin = g.max_margin_per_position
+    
+    if single_hand_margin <= max_margin:
+        # 如果单手保证金不超过最大限制,计算最大可开仓手数
+        max_hands = int(max_margin / single_hand_margin)
+        max_hands_by_cash = int(available_cash / single_hand_margin)
+        
+        # 取两者较小值
+        actual_hands = min(max_hands, max_hands_by_cash)
+        
+        # 确保至少开1手
+        actual_hands = max(1, actual_hands)
+        
+        log.info(f"单手保证金: {single_hand_margin:.0f}, 目标开仓手数: {actual_hands}")
+        
+        return actual_hands, single_hand_margin
+    else:
+        # 如果单手保证金超过最大限制,默认开仓1手
+        actual_hands = 1
+        
+        log.info(f"单手保证金: {single_hand_margin:.0f} 超过最大限制: {max_margin}, 默认开仓1手")
+        
+        return actual_hands, single_hand_margin
+
+def check_symbol_prefix_match(symbol, context, hold_symbols):
+    """检查是否有相似的持仓品种"""
+    log.debug(f"检查持仓")
+    symbol_prefix = symbol[:-9]
+
+    long_positions = context.subportfolios[0].long_positions
+    short_positions = context.subportfolios[0].short_positions
+    log.debug(f"long_positions: {long_positions}, short_positions: {short_positions}")
+    
+    for hold_symbol in hold_symbols:
+        hold_symbol_prefix = hold_symbol[:-9] if len(hold_symbol) > 9 else hold_symbol
+        
+        if symbol_prefix == hold_symbol_prefix:
+            return True
+    return False
+
+def calculate_average_daily_change_rate(security, days=30):
+    """计算日均变化率"""
+    historical_data = attribute_history(security, days + 1, '1d', ['close'])
+    daily_change_rates = abs(historical_data['close'].pct_change()).iloc[1:]
+    return daily_change_rates.mean()
+
+def calculate_realtime_ma_values(security, ma_periods):
+    """计算包含当前价格的实时均线值"""
+    historical_data = attribute_history(security, max(ma_periods), '1d', ['close'])
+    today_price = get_current_data()[security].last_price
+    close_prices = historical_data['close'].tolist() + [today_price]
+    ma_values = {f'ma{period}': sum(close_prices[-period:]) / period for period in ma_periods}
+    return ma_values
+
+def sync_trade_history_with_positions(context):
+    """同步g.trade_history与实际持仓,清理已平仓但记录未删除的持仓"""
+    if not g.trade_history:
+        return
+    
+    subportfolio = context.subportfolios[0]
+    actual_positions = set(subportfolio.long_positions.keys()) | set(subportfolio.short_positions.keys())
+    
+    # 找出g.trade_history中有记录但实际已平仓的合约
+    stale_records = []
+    for security in g.trade_history.keys():
+        if security not in actual_positions:
+            stale_records.append(security)
+    
+    # 清理这些过期记录
+    if stale_records:
+        log.info("=" * 60)
+        log.info("发现持仓记录与实际持仓不同步,进行清理:")
+        for security in stale_records:
+            trade_info = g.trade_history[security]
+            underlying_symbol = security.split('.')[0][:-4]
+            log.info(f"  清理过期记录: {underlying_symbol}({security}) {trade_info['direction']}, "
+                    f"成本价: {trade_info['entry_price']:.2f}, "
+                    f"入场时间: {trade_info['entry_time']}")
+            del g.trade_history[security]
+        log.info(f"共清理 {len(stale_records)} 条过期持仓记录")
+        log.info("=" * 60)
+
+def after_market_close(context):
+    """收盘后运行函数"""
+    log.info(f"函数运行时间(after_market_close):{context.current_dt.time()}")
+    
+    # 同步检查:清理g.trade_history中已平仓但记录未删除的持仓
+    sync_trade_history_with_positions(context)
+    
+    # 清空候选列表(每天重新检查)
+    g.daily_ma_candidates = {}
+    
+    # 清空排除缓存(每天重新检查)
+    excluded_count = len(g.excluded_contracts)
+    if excluded_count > 0:
+        log.info(f"清空排除缓存,共 {excluded_count} 个合约")
+        g.excluded_contracts = {}
+    
+    # 重置夜盘禁止操作标志
+    if g.night_session_blocked:
+        log.info(f"重置夜盘禁止操作标志")
+        g.night_session_blocked = False
+        g.night_session_blocked_trading_day = None
+    
+    # 只有当天有交易时才打印统计信息
+    if g.today_trades:
+        print_daily_trading_summary(context)
+        
+        # 清空当日交易记录
+        g.today_trades = []
+    
+    log.info('##############################################################')
+
+def print_daily_trading_summary(context):
+    """打印当日交易汇总"""
+    if not g.today_trades:
+        return
+    
+    log.info("\n=== 当日交易汇总 ===")
+    total_margin = 0
+    
+    for trade in g.today_trades:
+        if trade['order_amount'] > 0:  # 开仓
+            log.info(f"开仓 {trade['underlying_symbol']} {trade['direction']} {trade['order_amount']}手 "
+                  f"价格:{trade['order_price']:.2f} 保证金:{trade['margin_change']:.0f} 资金变化:{trade['cash_change']:.0f}")
+            total_margin += trade['margin_change']
+        else:  # 平仓
+            log.info(f"平仓 {trade['underlying_symbol']} {trade['direction']} {abs(trade['order_amount'])}手 "
+                  f"价格:{trade['order_price']:.2f}")
+    
+    log.info(f"当日保证金占用: {total_margin:.0f}")
+    log.info("==================\n")
+
+########################## 自动移仓换月函数 #################################
+def position_auto_switch(context, pindex=0, switch_func=None, callback=None):
+    """期货自动移仓换月"""
+    import re
+    subportfolio = context.subportfolios[pindex]
+    symbols = set(subportfolio.long_positions.keys()) | set(subportfolio.short_positions.keys())
+    switch_result = []
+    for symbol in symbols:
+        match = re.match(r"(?P<underlying_symbol>[A-Z]{1,})", symbol)
+        if not match:
+            raise ValueError("未知期货标的: {}".format(symbol))
+        else:
+            underlying_symbol = match.groupdict()["underlying_symbol"]
+            trading_start = get_futures_config(underlying_symbol, 'trading_start_time', None)
+            has_night_session = get_futures_config(underlying_symbol, 'has_night_session', False)
+            # log.debug(f"移仓换月: {symbol}, 交易开始时间: {trading_start}, 夜盘: {has_night_session}")
+            if trading_start and not has_reached_trading_start(context.current_dt, trading_start, has_night_session):
+                # log.info("{} 当前时间 {} 未到达交易开始时间 {} (夜盘:{} ),跳过移仓".format(
+                #     symbol,
+                #     context.current_dt.strftime('%H:%M:%S'),
+                #     trading_start,
+                #     has_night_session
+                # ))
+                continue
+            dominant = get_dominant_future(underlying_symbol)
+            cur = get_current_data()
+            symbol_last_price = cur[symbol].last_price
+            dominant_last_price = cur[dominant].last_price
+            
+            if dominant > symbol:
+                for positions_ in (subportfolio.long_positions, subportfolio.short_positions):
+                    if symbol not in positions_:
+                        continue
+                    p = positions_[symbol]
+
+                    if switch_func is not None:
+                        switch_func(context, pindex, p, dominant)
+                    else:
+                        amount = p.total_amount
+                        # 跌停不能开空和平多,涨停不能开多和平空
+                        if p.side == "long":
+                            symbol_low_limit = cur[symbol].low_limit
+                            dominant_high_limit = cur[dominant].high_limit
+                            if symbol_last_price <= symbol_low_limit:
+                                log.warning("标的{}跌停,无法平仓。移仓换月取消。".format(symbol))
+                                continue
+                            elif dominant_last_price >= dominant_high_limit:
+                                log.warning("标的{}涨停,无法开仓。移仓换月取消。".format(dominant))
+                                continue
+                            else:
+                                log.info("进行移仓换月: ({0},long) -> ({1},long)".format(symbol, dominant))
+                                order_old = order_target(symbol, 0, side='long')
+                                if order_old != None and order_old.filled > 0:
+                                    order_new = order_target(dominant, amount, side='long')
+                                    if order_new != None and order_new.filled > 0:
+                                        switch_result.append({"before": symbol, "after": dominant, "side": "long"})
+                                        # 换月成功,更新交易记录
+                                        if symbol in g.trade_history:
+                                            # 复制旧的交易记录作为基础
+                                            old_entry_price = g.trade_history[symbol]['entry_price']
+                                            g.trade_history[dominant] = g.trade_history[symbol].copy()
+
+                                            # 更新成本价为新合约的实际开仓价
+                                            new_entry_price = None
+                                            if order_new.avg_cost and order_new.avg_cost > 0:
+                                                new_entry_price = order_new.avg_cost
+                                                g.trade_history[dominant]['entry_price'] = order_new.avg_cost
+                                            elif order_new.price and order_new.price > 0:
+                                                new_entry_price = order_new.price
+                                                g.trade_history[dominant]['entry_price'] = order_new.price
+                                            else:
+                                                # 如果订单价格无效,使用当前价格作为成本价
+                                                new_entry_price = dominant_last_price
+                                                g.trade_history[dominant]['entry_price'] = dominant_last_price
+
+                                            # 更新入场时间
+                                            g.trade_history[dominant]['entry_time'] = context.current_dt
+                                            # 更新入场交易日
+                                            g.trade_history[dominant]['entry_trading_day'] = get_current_trading_day(context.current_dt)
+                                            # 删除旧合约的交易记录
+                                            del g.trade_history[symbol]
+
+                                            log.info(f"移仓换月成本价更新: {symbol} -> {dominant}, "
+                                                   f"旧成本价: {old_entry_price:.2f}, 新成本价: {new_entry_price:.2f}")
+                                    else:
+                                        log.warning("标的{}交易失败,无法开仓。移仓换月失败。".format(dominant))
+                        if p.side == "short":
+                            symbol_high_limit = cur[symbol].high_limit
+                            dominant_low_limit = cur[dominant].low_limit
+                            if symbol_last_price >= symbol_high_limit:
+                                log.warning("标的{}涨停,无法平仓。移仓换月取消。".format(symbol))
+                                continue
+                            elif dominant_last_price <= dominant_low_limit:
+                                log.warning("标的{}跌停,无法开仓。移仓换月取消。".format(dominant))
+                                continue
+                            else:
+                                log.info("进行移仓换月: ({0},short) -> ({1},short)".format(symbol, dominant))
+                                order_old = order_target(symbol, 0, side='short')
+                                if order_old != None and order_old.filled > 0:
+                                    order_new = order_target(dominant, amount, side='short')
+                                    if order_new != None and order_new.filled > 0:
+                                        switch_result.append({"before": symbol, "after": dominant, "side": "short"})
+                                        # 换月成功,更新交易记录
+                                        if symbol in g.trade_history:
+                                            # 复制旧的交易记录作为基础
+                                            old_entry_price = g.trade_history[symbol]['entry_price']
+                                            g.trade_history[dominant] = g.trade_history[symbol].copy()
+
+                                            # 更新成本价为新合约的实际开仓价
+                                            new_entry_price = None
+                                            if order_new.avg_cost and order_new.avg_cost > 0:
+                                                new_entry_price = order_new.avg_cost
+                                                g.trade_history[dominant]['entry_price'] = order_new.avg_cost
+                                            elif order_new.price and order_new.price > 0:
+                                                new_entry_price = order_new.price
+                                                g.trade_history[dominant]['entry_price'] = order_new.price
+                                            else:
+                                                # 如果订单价格无效,使用当前价格作为成本价
+                                                new_entry_price = dominant_last_price
+                                                g.trade_history[dominant]['entry_price'] = dominant_last_price
+
+                                            # 更新入场时间
+                                            g.trade_history[dominant]['entry_time'] = context.current_dt
+                                            # 更新入场交易日
+                                            g.trade_history[dominant]['entry_trading_day'] = get_current_trading_day(context.current_dt)
+                                            # 删除旧合约的交易记录
+                                            del g.trade_history[symbol]
+
+                                            log.info(f"移仓换月成本价更新: {symbol} -> {dominant}, "
+                                                   f"旧成本价: {old_entry_price:.2f}, 新成本价: {new_entry_price:.2f}")
+                                    else:
+                                        log.warning("标的{}交易失败,无法开仓。移仓换月失败。".format(dominant))
+                        if callback:
+                            callback(context, pindex, p, dominant)
+    return switch_result
+

+ 121 - 0
Lib/future/README_records_analysis.md

@@ -0,0 +1,121 @@
+# 期货开仓记录分析工具
+
+## 概览
+
+该工具基于 `Lib/future/records.csv` 的期货开仓记录,输出以 **平均盈亏** 与 **平均保证金收益率** 为核心的多维度分析结果。脚本聚焦以下能力:
+
+- 自动清洗与增强原始数据:提取品种代码、识别开盘后时间段、计算保证金收益率、穿越均线数量、成交额分组等 15+ 个衍生字段。
+- 提供单一文件内的全套统计汇总、CSV 导出和可视化图表,方便快速洞察交易表现。
+- 所有结果默认输出至 `Lib/future/analysis_results/` 目录,便于后续复盘或回测。
+
+## 核心功能
+
+### 数据预处理
+- 标的字段自动解析出品种代码(如“原油2002(SC2002.XINE)”→“SC”),提取成功率 100%。
+- 结合 `MAPatternStrategy_v002.py` 中的交易时间,计算 `<30分钟 / 30-60分钟 / >1小时` 三段开仓延迟区间。
+- 构建保证金收益率(交易盈亏 ÷ 保证金 × 100)、交易时段(夜盘/上午/下午)、穿越均线数量、成交额分组等指标,核心字段的构建方式可参考下方示意代码。
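+
+品种代码与保证金收益率的构建可参考下面的简化示意,思路与 `records_analysis.py` 中的 `extract_symbol_code` 一致(列名沿用 `records.csv` 中的"标的""交易盈亏""保证金"字段,仅作演示,实际以脚本实现为准):
+
+```python
+import re
+import pandas as pd
+
+def extract_symbol_code(target_str):
+    """从 '原油2002(SC2002.XINE)' 这类标的字符串中提取品种代码,如 'SC'。"""
+    match = re.search(r'\(([A-Z]+)\d+\.', str(target_str))
+    return match.group(1) if match else None
+
+df = pd.read_csv('records.csv')
+df['品种代码'] = df['标的'].apply(extract_symbol_code)
+df['保证金收益率'] = df['交易盈亏'] / df['保证金'] * 100  # 单位:%
+```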
+
+### 多维度统计
+- **均线组合**:统计出现次数、平均盈亏、平均保证金收益率、胜率与盈亏比,可快速筛选表现最稳健的组合。
+- **开盘后时间段**:对比不同开仓延迟区间的平均盈亏与保证金收益率,直观判断开仓时机的重要性。
+- **均线 × 时间段交叉**:同时查看胜率、平均盈亏、平均保证金收益率三张热力图,定位最优战术组合。
+- **交易类型 / 品种 / 品种代码**:支持从“开多/开空”“商品/金融”“Top 品种代码”多层级对比收益表现。
+- **其他维度**:控制台输出仍覆盖成交额分组、交易时段、穿越均线数量、多空对比、夜盘属性与组合策略排名等信息。
+
+### 可视化输出(PNG)
+
+1. `ma_lines_analysis.png`  
+   - 4 个子图:出现次数、胜率、平均盈亏、平均保证金收益率,全部带数值标注,重点突出收益类指标。
+2. `time_segment_analysis.png`  
+   - 展示三个开盘时间段的平均盈亏与平均保证金收益率对比(双柱状图)。
+3. `cross_analysis_heatmap.png`  
+   - 3 张热力图:胜率、平均盈亏、平均保证金收益率(均线组合 × 开盘时间段)。
+4. `variety_analysis.png`  
+   - 交易类型与品种类型的胜率/收益率对比,快速锁定得分最高的资产类别。
+
+> 说明:旧的 `additional_analysis.png` 已移除,所有图表均以平均盈亏与保证金收益率为主视角。
+
+### CSV 输出
+
+| 文件 | 说明 |
+| --- | --- |
+| `records_enhanced.csv` | 增强后的明细数据,包含所有衍生字段 |
+| `ma_lines_stats.csv` | 各均线组合的综合统计 |
+| `time_segment_stats.csv` | 三段开盘时间的统计 |
+| `symbol_stats.csv` | 各品种代码的统计 |
+| `combo_strategy_stats.csv` | “均线组合×时间段×方向” 组合,样本量≥5 |
+
+## 使用指南
+
+### 1. 安装依赖
+
+```bash
+pip install pandas numpy matplotlib seaborn
+```
+
+### 2. 运行脚本
+
+```bash
+cd Lib/future
+python records_analysis.py
+```
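+
+除命令行运行外,也可以在 Notebook 等交互环境中复用脚本函数(示意用法;假设脚本入口受 `if __name__ == '__main__'` 保护,且在 `Lib/future` 目录下执行、`records.csv` 位于同一目录):
+
+```python
+from records_analysis import load_and_preprocess_data, analyze_ma_lines
+
+df = load_and_preprocess_data('records.csv')   # 加载数据并生成全部衍生字段
+ma_stats = analyze_ma_lines(df)                # 打印并返回各均线组合的统计表
+print(ma_stats.sort_values('平均保证金收益率', ascending=False).head(10))
+```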
+
+### 3. 输出内容
+
+- 控制台:打印各维度汇总表,含样本量、平均盈亏、平均保证金收益率等。
+- `analysis_results/`:自动生成 PNG 图表与 CSV 文件。
+
+## 指标解释
+
+| 指标 | 定义 |
+| --- | --- |
+| 出现次数 | 分组内的交易笔数 |
+| 胜率 | 盈利笔数 / 总笔数(`交易盈亏 > 0` 为盈利) |
+| 平均盈亏 | 交易盈亏字段的均值(元) |
+| 盈亏比 | 平均盈利金额 / 平均亏损金额(绝对值) |
+| 平均保证金收益率 | 单笔收益率(盈亏/保证金×100)的平均值 |
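+
+上表中的胜率、盈亏比与平均保证金收益率可按如下口径计算,与脚本内 `calculate_statistics` 的逻辑一致(简化示意,`group` 为按某一维度分组后的 `records.csv` 子集):
+
+```python
+import numpy as np
+
+def summarize(group):
+    """按上表口径汇总一组开仓记录。"""
+    profits = group.loc[group['交易盈亏'] > 0, '交易盈亏']
+    losses = group.loc[group['交易盈亏'] <= 0, '交易盈亏']
+    avg_profit = profits.mean() if len(profits) else 0
+    avg_loss = abs(losses.mean()) if len(losses) else 0
+    return {
+        '出现次数': len(group),
+        '胜率': len(profits) / len(group) if len(group) else 0,
+        '平均盈亏': group['交易盈亏'].mean(),
+        '盈亏比': avg_profit / avg_loss if avg_loss > 0 else np.inf,
+        '平均保证金收益率': (group['交易盈亏'] / group['保证金'] * 100).mean(),
+    }
+```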
+
+时间段计算:用品种对应的 `trading_start_time` 与委托时间求差,自动处理夜盘跨日情况;交易时段依次划分为夜盘(21:00-03:00)、上午(09:00-12:00)与下午(12:00-16:00)。
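+
+开仓延迟区间的划分可参考下面的简化示意,与脚本中 `calculate_time_segment` 的处理一致:夜盘品种(21:00 开盘)在次日凌晨或白天的委托会先加 24 小时,再与开盘时间求差。
+
+```python
+from datetime import datetime, timedelta
+
+def classify_delay(order_time_str, trading_start_time_str):
+    """按委托时间与开盘时间之差,归入 <30分钟 / 30-60分钟 / >1小时。"""
+    order_time = datetime.strptime(order_time_str, '%H:%M:%S')
+    start_time = datetime.strptime(trading_start_time_str, '%H:%M')
+    if trading_start_time_str == '21:00' and order_time.hour < 21:
+        order_time += timedelta(days=1)  # 夜盘跨日:次日时段视为开盘之后
+    minutes = (order_time - start_time).total_seconds() / 60
+    if minutes < 0:
+        minutes += 24 * 60
+    if minutes < 30:
+        return '<30分钟'
+    if minutes < 60:
+        return '30-60分钟'
+    return '>1小时'
+
+print(classify_delay('21:05:00', '21:00'))  # <30分钟
+print(classify_delay('09:40:00', '21:00'))  # >1小时(夜盘品种次日上午开仓)
+```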
+
+## 数据洞察速览(基于 1061 条样本)
+
+- **开仓时机**:<30分钟平均盈亏 11,256 元、保证金收益率 5.73%,显著优于 >1小时段(1,073 元 / 4.17%)。
+- **均线组合**:`MA20;MA30` 在平均盈亏与保证金收益率上均表现亮眼(6,160 元 / 10.36%)。
+- **交易类型**:开多的平均保证金收益率 5.38%,高于开空的 3.20%。
+- **品种类型**:金融期货的平均保证金收益率 7.68%,显著优于商品期货的 3.42%。
+- **组合策略**:`MA10;MA20 + >1小时 + 开空` 的平均保证金收益率 21.25%,位列历史最佳组合之一。
+
+> 以上数字会随最新数据自动更新,可直接参考控制台及 CSV 输出。
+
+## 文件结构
+
+```
+Lib/future/
+├── records.csv                 # 原始交易记录
+├── records_analysis.py         # 主分析脚本
+├── README_records_analysis.md  # 本文档
+└── analysis_results/           # 输出目录
+    ├── *.png                   # 4 张可视化图表
+    ├── records_enhanced.csv    # 增强后的明细
+    ├── ma_lines_stats.csv
+    ├── time_segment_stats.csv
+    ├── symbol_stats.csv
+    └── combo_strategy_stats.csv
+```
+
+## 扩展建议
+
+- 添加更多维度:如持仓时长、波动率、止损/止盈比例等指标。
+- 引入机器学习:基于历史组合表现搭建简单的策略筛选模型。
+- 交互分析:将 CSV 导入 Notebook、BI 工具或 Plotly Dash 进行动态探索。
+
+## 技术栈
+
+- Python 3.x
+- pandas / numpy:数据处理与统计
+- matplotlib / seaborn:绘图与热力图
+
+---
+
+如需调整统计口径或新增图表,可直接修改 `records_analysis.py`,所有逻辑均已模块化拆分,便于快速扩展。
+

二進制
Lib/future/__pycache__/transaction_pair_analysis.cpython-311.pyc


+ 362 - 109
Lib/future/kline_reconstruction.py

@@ -8,7 +8,9 @@ import matplotlib.pyplot as plt
 import matplotlib.patches as patches
 from datetime import datetime, timedelta, date
 import re
+from tqdm import tqdm
 import os
+import zipfile
 import warnings
 warnings.filterwarnings('ignore')
 
@@ -31,13 +33,13 @@ def _get_current_directory():
         # 在 Jupyter notebook 环境中,__file__ 不存在,使用当前工作目录
         current_dir = os.getcwd()
         # 如果当前目录不是 future 目录,尝试查找
-        if not os.path.exists(os.path.join(current_dir, 'transaction.csv')):
+        if not os.path.exists(os.path.join(current_dir, 'transaction1.csv')):
             # 尝试查找 future 目录
             if 'future' not in current_dir:
                 # 尝试向上查找 future 目录
                 parent_dir = os.path.dirname(current_dir)
                 future_dir = os.path.join(parent_dir, 'future')
-                if os.path.exists(os.path.join(future_dir, 'transaction.csv')):
+                if os.path.exists(os.path.join(future_dir, 'transaction1.csv')):
                     current_dir = future_dir
     return current_dir
 
@@ -52,30 +54,45 @@ def read_and_filter_open_positions(csv_path):
     返回:
         pandas.DataFrame: 包含开仓记录的DataFrame
     """
-    try:
-        df = pd.read_csv(csv_path, encoding='utf-8-sig')
-        
-        # 筛选交易类型第一个字符为"开"的行
-        open_positions = df[df['交易类型'].str[0] == '开'].copy()
-        
-        print(f"从CSV文件中读取到 {len(df)} 条记录")
-        print(f"筛选出 {len(open_positions)} 条开仓记录")
-        
-        return open_positions
-    except Exception as e:
-        print(f"读取CSV文件时出错: {str(e)}")
-        return pd.DataFrame()
+    # 尝试多种编码格式
+    encodings = ['utf-8-sig', 'utf-8', 'gbk', 'gb2312', 'gb18030', 'latin1']
+    
+    for encoding in encodings:
+        try:
+            df = pd.read_csv(csv_path, encoding=encoding)
+            
+            # 筛选交易类型第一个字符为"开"的行
+            open_positions = df[df['交易类型'].str[0] == '开'].copy()
+            
+            print(f"成功使用 {encoding} 编码读取CSV文件")
+            print(f"从CSV文件中读取到 {len(df)} 条记录")
+            print(f"筛选出 {len(open_positions)} 条开仓记录")
+            
+            return open_positions
+        except UnicodeDecodeError:
+            continue
+        except Exception as e:
+            # 如果是其他错误(比如列名不存在),也尝试下一种编码
+            if encoding == encodings[-1]:
+                # 最后一种编码也失败了,抛出错误
+                print(f"读取CSV文件时出错: {str(e)}")
+                raise
+            continue
+    
+    # 所有编码都失败了
+    print(f"无法使用任何编码格式读取CSV文件: {csv_path}")
+    return pd.DataFrame()
 
 
 def extract_contract_code_and_date(row):
     """
-    从标的列提取合约编号,从日期列提取日期
+    从标的列提取合约编号,从日期列提取日期,从委托时间计算实际交易日,从成交价列提取成交价,从交易类型提取开仓方向
     
     参数:
         row (pandas.Series): DataFrame的一行数据
     
     返回:
-        tuple: (contract_code, trade_date) 或 (None, None) 如果提取失败
+        tuple: (contract_code, actual_trade_date, trade_price, direction, order_time) 或 (None, None, None, None, None) 如果提取失败
     """
     try:
         # 提取合约编号:从"标的"列中提取括号内的内容
@@ -85,29 +102,122 @@ def extract_contract_code_and_date(row):
             contract_code = match.group(1)
         else:
             print(f"无法从标的 '{target_str}' 中提取合约编号")
-            return None, None
+            return None, None, None, None, None
+        
+        # 提取日期,支持多种日期格式
+        date_str = str(row['日期']).strip()
+        
+        # 尝试多种日期格式
+        date_formats = [
+            '%Y-%m-%d',      # 2025-01-02
+            '%d/%m/%Y',      # 14/10/2025
+            '%Y/%m/%d',      # 2025/01/02
+            '%d-%m-%Y',      # 14-10-2025
+            '%Y%m%d',        # 20250102
+        ]
+        
+        base_date = None
+        for date_format in date_formats:
+            try:
+                base_date = datetime.strptime(date_str, date_format).date()
+                break
+            except ValueError:
+                continue
         
-        # 提取日期
-        date_str = str(row['日期'])
+        if base_date is None:
+            print(f"日期格式错误: {date_str} (支持的格式: YYYY-MM-DD, DD/MM/YYYY, YYYY/MM/DD, DD-MM-YYYY, YYYYMMDD)")
+            return None, None, None, None, None
+        
+        # 提取委托时间,判断是否是晚上(>=21:00)
+        order_time_str = str(row['委托时间']).strip()
         try:
-            trade_date = datetime.strptime(date_str, '%Y-%m-%d').date()
-        except:
-            print(f"日期格式错误: {date_str}")
-            return None, None
+            # 解析时间格式 HH:MM:SS 或 HH:MM
+            time_parts = order_time_str.split(':')
+            hour = int(time_parts[0])
+            
+            # 如果委托时间 >= 21:00,需要找到下一个交易日
+            if hour >= 21:
+                # 使用get_trade_days获取从base_date开始的交易日
+                # count=2 表示获取包括base_date在内的2个交易日
+                # 如果base_date是交易日,则返回[base_date, next_trade_day]
+                # 如果base_date不是交易日,则返回下一个交易日和再下一个交易日
+                try:
+                    trade_days = get_trade_days(start_date=base_date, count=2)
+                    # print(f"trade_days: {trade_days}")
+                    if len(trade_days) >= 2:
+                        # 取第二个交易日(索引1)作为实际交易日
+                        next_trade_day = trade_days[1]
+                        if isinstance(next_trade_day, datetime):
+                            actual_trade_date = next_trade_day.date()
+                        elif isinstance(next_trade_day, date):
+                            actual_trade_date = next_trade_day
+                        else:
+                            # 如果类型不对,尝试转换
+                            actual_trade_date = base_date
+                            print(f"警告:获取的交易日类型异常: {type(next_trade_day)}")
+                    elif len(trade_days) == 1:
+                        # 如果只返回1个交易日,说明base_date就是交易日,但已经是最后一个交易日
+                        # 这种情况应该取下一个交易日,但可能超出了数据范围
+                        first_day = trade_days[0]
+                        if isinstance(first_day, datetime):
+                            actual_trade_date = first_day.date()
+                        elif isinstance(first_day, date):
+                            actual_trade_date = first_day
+                        else:
+                            actual_trade_date = base_date
+                        print(f"警告:只获取到1个交易日,可能已到数据边界")
+                    else:
+                        # 如果获取失败,使用base_date
+                        actual_trade_date = base_date
+                        print(f"警告:无法获取下一个交易日,使用原始日期")
+                except Exception as e:
+                    # 如果获取交易日失败,使用base_date
+                    actual_trade_date = base_date
+                    print(f"获取交易日时出错: {str(e)},使用原始日期")
+            else:
+                # 委托时间 < 21:00,使用原始日期
+                actual_trade_date = base_date
+        except Exception as e:
+            # 如果解析时间失败,使用原始日期
+            print(f"解析委托时间失败: {order_time_str}, 使用原始日期")
+            actual_trade_date = base_date
+        
+        # print(f"成交日期:{date_str},委托时间:{order_time_str},实际交易日:{actual_trade_date}")
         
-        return contract_code, trade_date
+        # 提取成交价
+        try:
+            trade_price = float(row['成交价'])
+        except (ValueError, KeyError):
+            print(f"无法提取成交价: {row.get('成交价', 'N/A')}")
+            return None, None, None, None, None
+        
+        # 提取开仓方向:从"交易类型"列提取,开多是'long',开空是'short'
+        try:
+            trade_type = str(row['交易类型']).strip()
+            if '开多' in trade_type or '多' in trade_type:
+                direction = 'long'
+            elif '开空' in trade_type or '空' in trade_type:
+                direction = 'short'
+            else:
+                print(f"无法识别交易方向: {trade_type}")
+                direction = 'unknown'
+        except (KeyError, ValueError):
+            print(f"无法提取交易类型")
+            direction = 'unknown'
+        
+        return contract_code, actual_trade_date, trade_price, direction, order_time_str
     except Exception as e:
         print(f"提取合约编号和日期时出错: {str(e)}")
-        return None, None
+        return None, None, None, None, None
 
 
-def calculate_trade_days_range(trade_date, days_before=60, days_after=10):
+def calculate_trade_days_range(trade_date, days_before=100, days_after=10):
     """
     计算交易日范围:往前days_before个交易日,往后days_after个交易日
     
     参数:
         trade_date (date): 开仓日期
-        days_before (int): 往前交易日数量,默认60
+        days_before (int): 往前交易日数量,默认100
         days_after (int): 往后交易日数量,默认10
     
     返回:
@@ -122,7 +232,14 @@ def calculate_trade_days_range(trade_date, days_before=60, days_after=10):
             print(f"无法获取足够的往前交易日,只获取到 {len(trade_days_before)} 个")
             return None, None
         
-        start_date = trade_days_before[0].date()
+        # 处理返回的日期对象:可能是date或datetime类型
+        first_day = trade_days_before[0]
+        if isinstance(first_day, datetime):
+            start_date = first_day.date()
+        elif isinstance(first_day, date):
+            start_date = first_day
+        else:
+            start_date = first_day
         
         # 往后找:从trade_date往后找days_after个交易日
         # get_trade_days(start_date=trade_date, count=n) 返回包括trade_date在内的n个交易日
@@ -132,7 +249,14 @@ def calculate_trade_days_range(trade_date, days_before=60, days_after=10):
             print(f"无法获取足够的往后交易日,只获取到 {len(trade_days_after)} 个")
             return None, None
         
-        end_date = trade_days_after[-1].date()
+        # 处理返回的日期对象:可能是date或datetime类型
+        last_day = trade_days_after[-1]
+        if isinstance(last_day, datetime):
+            end_date = last_day.date()
+        elif isinstance(last_day, date):
+            end_date = last_day
+        else:
+            end_date = last_day
         
         return start_date, end_date
     except Exception as e:
@@ -209,14 +333,17 @@ def filter_data_with_ma(data):
     return filtered_data
 
 
-def plot_kline_chart(data, contract_code, trade_date, save_path):
+def plot_kline_chart(data, contract_code, trade_date, trade_price, direction, order_time, save_path):
     """
-    绘制K线图(包含均线和开仓日期标注)
+    绘制K线图(包含均线和开仓日期、成交价、方向、委托时间标注)
     
     参数:
         data (pandas.DataFrame): 包含OHLC和均线数据的DataFrame
         contract_code (str): 合约编号
-        trade_date (date): 开仓日期
+        trade_date (date): 实际交易日
+        trade_price (float): 成交价
+        direction (str): 开仓方向,'long'或'short'
+        order_time (str): 委托时间
         save_path (str): 保存路径
     """
     try:
@@ -230,18 +357,91 @@ def plot_kline_chart(data, contract_code, trade_date, save_path):
         lows = data['low']
         closes = data['close']
         
+        # 调试:打印数据结构信息(仅第一次调用时打印)
+        if not hasattr(plot_kline_chart, '_debug_printed'):
+            print(f"\n=== K线数据索引类型调试信息 ===")
+            print(f"索引类型: {type(dates)}")
+            print(f"索引数据类型: {type(dates[0]) if len(dates) > 0 else 'N/A'}")
+            print(f"前3个索引值: {[dates[i] for i in range(min(3, len(dates)))]}")
+            print(f"索引是否为DatetimeIndex: {isinstance(dates, pd.DatetimeIndex)}")
+            print(f"================================\n")
+            plot_kline_chart._debug_printed = True
+        
+        # 统一转换为date类型进行比较
+        trade_date_normalized = trade_date
+        if isinstance(trade_date, datetime):
+            trade_date_normalized = trade_date.date()
+        elif isinstance(trade_date, pd.Timestamp):
+            trade_date_normalized = trade_date.date()
+        elif not isinstance(trade_date, date):
+            try:
+                trade_date_normalized = pd.to_datetime(trade_date).date()
+            except:
+                pass
+        
         # 找到开仓日期在数据中的位置
         trade_date_idx = None
-        for i, date_idx in enumerate(dates):
-            if isinstance(date_idx, date):
-                if date_idx == trade_date:
-                    trade_date_idx = i
-                    break
-            elif isinstance(date_idx, datetime):
-                if date_idx.date() == trade_date:
+        
+        # 如果索引是DatetimeIndex,直接使用date比较
+        if isinstance(dates, pd.DatetimeIndex):
+            # 将DatetimeIndex转换为date进行比较
+            try:
+                # 使用normalize()将时间部分去掉,然后比较date
+                trade_date_normalized_dt = pd.Timestamp(trade_date_normalized)
+                # 查找匹配的日期
+                mask = dates.normalize() == trade_date_normalized_dt
+                if mask.any():
+                    trade_date_idx = mask.argmax()
+            except Exception as e:
+                print(f"使用DatetimeIndex匹配时出错: {e}")
+        
+        # 如果还没找到,使用循环方式查找
+        if trade_date_idx is None:
+            for i, date_idx in enumerate(dates):
+                date_to_compare = None
+                # 处理pandas Timestamp类型
+                if isinstance(date_idx, pd.Timestamp):
+                    date_to_compare = date_idx.date()
+                elif isinstance(date_idx, datetime):
+                    date_to_compare = date_idx.date()
+                elif isinstance(date_idx, date):
+                    date_to_compare = date_idx
+                elif hasattr(date_idx, 'date'):
+                    try:
+                        date_to_compare = date_idx.date()
+                    except:
+                        pass
+                
+                # 比较日期
+                if date_to_compare is not None and date_to_compare == trade_date_normalized:
                     trade_date_idx = i
                     break
         
+        # 如果还是没找到,尝试查找最接近的日期(前后各找1天)
+        if trade_date_idx is None:
+            print(f"警告:未找到精确匹配的交易日 {trade_date_normalized}")
+            print(f"  尝试查找前后1天的日期...")
+            for offset in [-1, 1]:
+                try:
+                    target_date = trade_date_normalized + timedelta(days=offset)
+                    for i, date_idx in enumerate(dates):
+                        date_to_compare = None
+                        if isinstance(date_idx, pd.Timestamp):
+                            date_to_compare = date_idx.date()
+                        elif isinstance(date_idx, datetime):
+                            date_to_compare = date_idx.date()
+                        elif isinstance(date_idx, date):
+                            date_to_compare = date_idx
+                        
+                        if date_to_compare == target_date:
+                            trade_date_idx = i
+                            print(f"  找到最接近的日期 {target_date} (偏移{offset}天) 在索引 {i}")
+                            break
+                    if trade_date_idx is not None:
+                        break
+                except:
+                    pass
+        
         # 绘制K线
         for i in range(len(data)):
             date_idx = dates[i]
@@ -275,15 +475,40 @@ def plot_kline_chart(data, contract_code, trade_date, save_path):
         ax.plot(range(len(data)), data['ma20'], label='MA20', color='purple', linewidth=1.5, alpha=0.8)
         ax.plot(range(len(data)), data['ma30'], label='MA30', color='brown', linewidth=1.5, alpha=0.8)
         
-        # 标注开仓日期位置
+        # 标注开仓日期位置和成交价
         if trade_date_idx is not None:
-            trade_price = closes.iloc[trade_date_idx]
-            ax.plot(trade_date_idx, trade_price, marker='*', markersize=15, 
-                   color='yellow', markeredgecolor='black', markeredgewidth=1.5,
+            # 绘制标记点(使用成交价)
+            ax.plot(trade_date_idx, trade_price, marker='*', markersize=20, 
+                   color='yellow', markeredgecolor='black', markeredgewidth=2,
                    label='Open Position', zorder=10)
             # 添加垂直线
             ax.axvline(x=trade_date_idx, color='yellow', linestyle='--', 
                       linewidth=2, alpha=0.7, zorder=5)
+            
+            # 标注日期、成交价、方向和委托时间文本
+            date_label = trade_date.strftime('%Y-%m-%d')
+            price_label = f'Price: {trade_price:.2f}'
+            direction_label = f'Direction: {direction}'
+            time_label = f'Time: {order_time}'
+            
+            # 计算文本位置(在标记点上方,确保可见)
+            price_range = highs.max() - lows.min()
+            y_offset = max(price_range * 0.08, (highs.max() - trade_price) * 0.3)  # 取较大者:价格区间的8% 或 标记点上方剩余空间的30%
+            text_y = trade_price + y_offset
+            
+            # 如果文本位置超出图表范围,放在标记点下方
+            if text_y > highs.max():
+                text_y = trade_price - price_range * 0.08
+            
+            # 添加文本标注(包含所有信息)
+            annotation_text = f'{date_label}\n{price_label}\n{direction_label}\n{time_label}'
+            ax.text(trade_date_idx, text_y, annotation_text, 
+                   fontsize=10, ha='center', va='bottom',
+                   bbox=dict(boxstyle='round,pad=0.6', facecolor='yellow', alpha=0.9, edgecolor='black', linewidth=1.5),
+                   zorder=11, weight='bold')
+        else:
+            # 未找到精确或临近的交易日索引,无法标注,仅给出警告
+            print(f"警告:交易日 {trade_date} 不在K线数据范围内,无法标注")
         
         # 设置图表标题和标签(使用英文)
         contract_simple = contract_code.split('.')[0]  # 提取合约编号的简约部分
@@ -295,7 +520,7 @@ def plot_kline_chart(data, contract_code, trade_date, save_path):
         ax.set_xlabel('Time', fontsize=12)
         ax.set_ylabel('Price', fontsize=12)
         ax.grid(True, alpha=0.3)
-        ax.legend(loc='upper left', fontsize=10)
+        ax.legend(loc='lower left', fontsize=10)
         
         # 设置x轴标签
         step = max(1, len(data) // 10)  # 显示约10个时间标签
@@ -332,63 +557,89 @@ def plot_kline_chart(data, contract_code, trade_date, save_path):
         # 调整布局并保存
         plt.tight_layout()
         plt.savefig(save_path, dpi=150, bbox_inches='tight')
-        plt.close()
-        
-        print(f"K线图已保存到: {save_path}")
+        # print(f"K线图已保存到: {save_path}")
+        plt.close(fig)
         
     except Exception as e:
         print(f"绘制K线图时出错: {str(e)}")
+        # 确保即使出错也关闭图形
+        try:
+            plt.close('all')
+        except:
+            pass
         raise
 
 
-def reconstruct_kline_from_transactions(csv_path=None, output_dir=None):
+def create_zip_archive(directory_path, zip_filename=None):
+    """
+    将指定目录打包成zip文件
+    
+    参数:
+        directory_path (str): 要打包的目录路径
+        zip_filename (str): zip文件名,如果为None则自动生成
+    
+    返回:
+        str: zip文件路径
+    """
+    if not os.path.exists(directory_path):
+        print(f"目录不存在: {directory_path}")
+        return None
+    
+    if zip_filename is None:
+        # 自动生成zip文件名
+        timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
+        dir_name = os.path.basename(os.path.normpath(directory_path))
+        zip_filename = f"{dir_name}_{timestamp}.zip"
+        # 保存在目录的父目录中
+        zip_path = os.path.join(os.path.dirname(directory_path), zip_filename)
+    else:
+        zip_path = zip_filename
+    
+    try:
+        print(f"\n=== 开始打包目录: {directory_path} ===")
+        file_count = 0
+        with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zipf:
+            # 遍历目录中的所有文件
+            for root, dirs, files in os.walk(directory_path):
+                for file in files:
+                    file_path = os.path.join(root, file)
+                    # 计算相对路径:相对于要打包的目录
+                    arcname = os.path.relpath(file_path, directory_path)
+                    zipf.write(file_path, arcname)
+                    file_count += 1
+        
+        # 获取zip文件大小
+        zip_size = os.path.getsize(zip_path) / (1024 * 1024)  # MB
+        print(f"✓ 打包完成: {zip_path}")
+        print(f"  包含文件数: {file_count} 个")
+        print(f"  文件大小: {zip_size:.2f} MB")
+        
+        return zip_path
+    except Exception as e:
+        print(f"✗ 打包时出错: {str(e)}")
+        return None
+
+
+def reconstruct_kline_from_transactions(csv_filename=None, output_dir=None):
     """
     主函数:从交易记录中复原K线图
     
     参数:
-        csv_path (str): CSV文件路径,默认为 'Lib/future/transaction.csv'
-        output_dir (str): 输出目录,默认为 'Lib/future/K'
+        csv_filename (str): CSV文件名,如果为None则需要在代码中设置文件名
+        output_dir (str): 输出目录,如果为None则自动设置为当前目录的K子目录
     """
-    # 设置默认路径
-    if csv_path is None:
-        # 获取当前文件所在目录
-        # 在 Jupyter notebook 中,__file__ 不存在,使用当前工作目录
-        try:
-            current_dir = os.path.dirname(os.path.abspath(__file__))
-        except NameError:
-            # 在 Jupyter notebook 环境中,使用当前工作目录
-            current_dir = os.getcwd()
-            # 如果当前目录不是 future 目录,尝试查找
-            if not os.path.exists(os.path.join(current_dir, 'transaction.csv')):
-                # 尝试查找 future 目录
-                if 'future' in current_dir:
-                    pass  # 已经在 future 目录中
-                else:
-                    # 尝试向上查找 future 目录
-                    parent_dir = os.path.dirname(current_dir)
-                    future_dir = os.path.join(parent_dir, 'future')
-                    if os.path.exists(os.path.join(future_dir, 'transaction.csv')):
-                        current_dir = future_dir
-        csv_path = os.path.join(current_dir, 'transaction.csv')
+    # ========== 路径配置:只需在这里设置CSV文件名 ==========
+    if csv_filename is None:
+        # 设置CSV文件名(只需修改文件名,不需要完整路径)
+        csv_filename = 'transaction4.csv'
+    # ====================================================
+    
+    # 获取当前目录并拼接CSV文件路径
+    current_dir = _get_current_directory()
+    csv_path = os.path.join(current_dir, csv_filename)
     
+    # 自动设置输出目录
     if output_dir is None:
-        # 获取当前文件所在目录
-        try:
-            current_dir = os.path.dirname(os.path.abspath(__file__))
-        except NameError:
-            # 在 Jupyter notebook 环境中,使用当前工作目录
-            current_dir = os.getcwd()
-            # 如果当前目录不是 future 目录,尝试查找
-            if not os.path.exists(os.path.join(current_dir, 'transaction.csv')):
-                # 尝试查找 future 目录
-                if 'future' in current_dir:
-                    pass  # 已经在 future 目录中
-                else:
-                    # 尝试向上查找 future 目录
-                    parent_dir = os.path.dirname(current_dir)
-                    future_dir = os.path.join(parent_dir, 'future')
-                    if os.path.exists(os.path.join(future_dir, 'transaction.csv')):
-                        current_dir = future_dir
         output_dir = os.path.join(current_dir, 'K')
     
     # 确保输出目录存在
@@ -408,28 +659,24 @@ def reconstruct_kline_from_transactions(csv_path=None, output_dir=None):
     success_count = 0
     fail_count = 0
     
-    for idx, row in open_positions.iterrows():
-        print(f"\n--- 处理第 {idx + 1}/{len(open_positions)} 条记录 ---")
+    for idx, row in tqdm(open_positions.iterrows(), total=len(open_positions), desc="处理开仓记录"):
+        # print(f"\n--- 处理第 {idx + 1}/{len(open_positions)} 条记录 ---")
         
         try:
-            # 提取合约编号和日期
-            contract_code, trade_date = extract_contract_code_and_date(row)
-            if contract_code is None or trade_date is None:
-                print(f"跳过:无法提取合约编号或日期")
+            # 提取合约编号、实际交易日、成交价、开仓方向和委托时间
+            contract_code, actual_trade_date, trade_price, direction, order_time = extract_contract_code_and_date(row)
+            if contract_code is None or actual_trade_date is None or trade_price is None or direction is None or order_time is None:
+                print(f"跳过:无法提取完整信息(合约编号、日期、成交价、方向或委托时间)")
                 fail_count += 1
                 continue
             
-            print(f"合约编号: {contract_code}, 开仓日期: {trade_date}")
-            
             # 计算交易日范围
-            start_date, end_date = calculate_trade_days_range(trade_date, days_before=60, days_after=10)
+            start_date, end_date = calculate_trade_days_range(actual_trade_date, days_before=100, days_after=10)
             if start_date is None or end_date is None:
                 print(f"跳过:无法计算交易日范围")
                 fail_count += 1
                 continue
             
-            print(f"数据范围: {start_date} 至 {end_date}")
-            
             # 获取K线数据
             kline_data = get_kline_data(contract_code, start_date, end_date)
             if kline_data is None or len(kline_data) == 0:
@@ -437,8 +684,6 @@ def reconstruct_kline_from_transactions(csv_path=None, output_dir=None):
                 fail_count += 1
                 continue
             
-            print(f"获取到 {len(kline_data)} 条K线数据")
-            
             # 计算均线
             kline_data = calculate_moving_averages(kline_data)
             
@@ -449,18 +694,15 @@ def reconstruct_kline_from_transactions(csv_path=None, output_dir=None):
                 fail_count += 1
                 continue
             
-            print(f"过滤后剩余 {len(filtered_data)} 条有效数据")
-            
             # 生成文件名
             contract_simple = contract_code.split('.')[0]  # 提取合约编号的简约部分
-            filename = f"{contract_simple}_{trade_date.strftime('%Y%m%d')}.png"
+            filename = f"{contract_simple}_{actual_trade_date.strftime('%Y%m%d')}.png"
             save_path = os.path.join(output_dir, filename)
             
-            # 绘制K线图
-            plot_kline_chart(filtered_data, contract_code, trade_date, save_path)
+            # 绘制K线图(传入实际交易日和成交价)
+            plot_kline_chart(filtered_data, contract_code, actual_trade_date, trade_price, direction, order_time, save_path)
             
             success_count += 1
-            print(f"✓ 成功处理")
             
         except Exception as e:
             print(f"✗ 处理时出错: {str(e)}")
@@ -471,7 +713,18 @@ def reconstruct_kline_from_transactions(csv_path=None, output_dir=None):
     print(f"\n=== 处理完成 ===")
     print(f"成功: {success_count} 条")
     print(f"失败: {fail_count} 条")
-    print(f"总计: {len(open_positions)} 条")
+    print(f"总计: {success_count + fail_count} 条")
+    
+    # 3. 打包图片目录
+    if success_count > 0:
+        print(f"\n=== 步骤3: 打包图片目录 ===")
+        zip_path = create_zip_archive(output_dir)
+        if zip_path:
+            print(f"✓ 打包文件已保存: {zip_path}")
+        else:
+            print(f"✗ 打包失败")
+    else:
+        print(f"\n未生成任何图片,跳过打包步骤")
 
 
 # 使用示例

+ 1151 - 0
Lib/future/records_analysis.py

@@ -0,0 +1,1151 @@
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+"""
+期货开仓记录分析工具
+分析 records.csv 中的期货交易数据,提供多维度统计分析
+"""
+
+import pandas as pd
+import numpy as np
+import matplotlib.pyplot as plt
+import seaborn as sns
+from datetime import datetime, timedelta
+import re
+import os
+import sys
+
+# 设置中文字体
+plt.rcParams['font.sans-serif'] = ['Arial Unicode MS', 'SimHei', 'DejaVu Sans']
+plt.rcParams['axes.unicode_minus'] = False
+
+# 期货配置字典(从MAPatternStrategy_v002.py复制)
+FUTURES_CONFIG = {
+    # 贵金属
+    'AU': {'has_night_session': True, 'margin_rate': {'long': 0.21, 'short': 0.21}, 'multiplier': 1000, 'trading_start_time': '21:00'},
+    'AG': {'has_night_session': True, 'margin_rate': {'long': 0.22, 'short': 0.22}, 'multiplier': 15, 'trading_start_time': '21:00'},
+    
+    # 有色金属
+    'CU': {'has_night_session': True, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    'AL': {'has_night_session': True, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    'ZN': {'has_night_session': True, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    'PB': {'has_night_session': True, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    'NI': {'has_night_session': True, 'margin_rate': {'long': 0.16, 'short': 0.16}, 'multiplier': 1, 'trading_start_time': '21:00'},
+    'SN': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 1, 'trading_start_time': '21:00'},
+    'SS': {'has_night_session': True, 'margin_rate': {'long': 0.07, 'short': 0.07}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    
+    # 黑色系
+    'RB': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'HC': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'I': {'has_night_session': True, 'margin_rate': {'long': 0.16, 'short': 0.16}, 'multiplier': 100, 'trading_start_time': '21:00'},
+    'JM': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 100, 'trading_start_time': '21:00'},
+    'J': {'has_night_session': True, 'margin_rate': {'long': 0.25, 'short': 0.25}, 'multiplier': 60, 'trading_start_time': '21:00'},
+    
+    # 能源化工
+    'SP': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'FU': {'has_night_session': True, 'margin_rate': {'long': 0.16, 'short': 0.16}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'BU': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'RU': {'has_night_session': True, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'BR': {'has_night_session': True, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    'SC': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 1000, 'trading_start_time': '21:00'},
+    'NR': {'has_night_session': True, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'LU': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'LC': {'has_night_session': False, 'margin_rate': {'long': 0.16, 'short': 0.16}, 'multiplier': 1, 'trading_start_time': '09:00'},
+    
+    # 化工
+    'FG': {'has_night_session': True, 'margin_rate': {'long': 0.16, 'short': 0.16}, 'multiplier': 20, 'trading_start_time': '21:00'},
+    'TA': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    'MA': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'SA': {'has_night_session': True, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 20, 'trading_start_time': '21:00'},
+    'L': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    'V': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    'EG': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'PP': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    'EB': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    'PG': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 20, 'trading_start_time': '21:00'},
+    'PX': {'has_night_session': True, 'margin_rate': {'long': 0.1, 'short': 0.1}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    
+    # 农产品
+    'RM': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'OI': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'CF': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    'SR': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'PF': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    'C': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'CS': {'has_night_session': True, 'margin_rate': {'long': 0.11, 'short': 0.11}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'CY': {'has_night_session': True, 'margin_rate': {'long': 0.11, 'short': 0.11}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    'A': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'B': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'M': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'Y': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    'P': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '21:00'},
+    
+    # 金融期货及其他品种(是否有夜盘以 has_night_session 字段为准)
+    'IF': {'has_night_session': False, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 300, 'trading_start_time': '09:30'},
+    'IH': {'has_night_session': False, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 300, 'trading_start_time': '09:30'},
+    'IC': {'has_night_session': False, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 200, 'trading_start_time': '09:30'},
+    'IM': {'has_night_session': False, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 200, 'trading_start_time': '09:30'},
+    'AP': {'has_night_session': False, 'margin_rate': {'long': 0.16, 'short': 0.16}, 'multiplier': 10, 'trading_start_time': '09:00'},
+    'CJ': {'has_night_session': False, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 5, 'trading_start_time': '09:00'},
+    'PK': {'has_night_session': False, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 5, 'trading_start_time': '09:00'},
+    'JD': {'has_night_session': False, 'margin_rate': {'long': 0.11, 'short': 0.11}, 'multiplier': 10, 'trading_start_time': '09:00'},
+    'LH': {'has_night_session': False, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 16, 'trading_start_time': '09:00'},
+    'T': {'has_night_session': False, 'margin_rate': {'long': 0.03, 'short': 0.03}, 'multiplier': 1000000, 'trading_start_time': '09:30'},
+    'PS': {'has_night_session': False, 'margin_rate': {'long': 0.16, 'short': 0.16}, 'multiplier': 3, 'trading_start_time': '09:00'},
+    'UR': {'has_night_session': False, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 20, 'trading_start_time': '09:00'},
+    'MO': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 100, 'trading_start_time': '21:00'},
+    'HO': {'has_night_session': False, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 100, 'trading_start_time': '09:30'},
+    'LG': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 90, 'trading_start_time': '21:00'},
+    'EC': {'has_night_session': False, 'margin_rate': {'long': 0.23, 'short': 0.23}, 'multiplier': 50, 'trading_start_time': '09:00'},
+    'OP': {'has_night_session': False, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 40, 'trading_start_time': '09:00'},
+    'BC': {'has_night_session': True, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 5, 'trading_start_time': '21:00'},
+    'SH': {'has_night_session': True, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 30, 'trading_start_time': '21:00'},
+    'TS': {'has_night_session': False, 'margin_rate': {'long': 0.015, 'short': 0.015}, 'multiplier': 2000000, 'trading_start_time': '09:30'},
+    'AD': {'has_night_session': False, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 10, 'trading_start_time': '09:00'},
+    'PL': {'has_night_session': False, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 20, 'trading_start_time': '09:00'},
+    'SI': {'has_night_session': False, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 5, 'trading_start_time': '09:00'},
+    'SM': {'has_night_session': False, 'margin_rate': {'long': 0.15, 'short': 0.15}, 'multiplier': 5, 'trading_start_time': '09:00'},
+    'AO': {'has_night_session': True, 'margin_rate': {'long': 0.17, 'short': 0.17}, 'multiplier': 20, 'trading_start_time': '21:00'},
+    'TL': {'has_night_session': False, 'margin_rate': {'long': 0.045, 'short': 0.045}, 'multiplier': 1000000, 'trading_start_time': '09:00'},
+    'SF': {'has_night_session': False, 'margin_rate': {'long': 0.14, 'short': 0.14}, 'multiplier': 5, 'trading_start_time': '09:00'},
+    'PR': {'has_night_session': True, 'margin_rate': {'long': 0.12, 'short': 0.12}, 'multiplier': 15, 'trading_start_time': '21:00'},
+    'TF': {'has_night_session': False, 'margin_rate': {'long': 0.022, 'short': 0.022}, 'multiplier': 1000000, 'trading_start_time': '09:00'},
+    'BZ': {'has_night_session': False, 'margin_rate': {'long': 0.13, 'short': 0.13}, 'multiplier': 30, 'trading_start_time': '09:00'},
+}
+
+
+def extract_symbol_code(target_str):
+    """
+    从标的字段提取品种代码
+    例如: '原油2002(SC2002.XINE)' -> 'SC'
+    """
+    match = re.search(r'\(([A-Z]+)\d+\.', target_str)
+    if match:
+        return match.group(1)
+    return None
+
+
+DETAIL_KEYS = ['prev_close', 'open', 'MA5', 'MA10', 'MA20', 'MA30', 'MA60', 'AVG5']
+
+
+def parse_details(details_str):
+    """
+    将details列拆解成独立字段
+    """
+    if pd.isna(details_str):
+        return {}
+    result = {}
+    for part in str(details_str).split('|'):
+        if ':' not in part:
+            continue
+        key, value = part.split(':', 1)
+        key = key.strip()
+        raw_value = value.strip()
+        is_percent = False
+        if raw_value.endswith('%'):
+            is_percent = True
+            raw_value = raw_value[:-1]
+        try:
+            numeric_value = float(raw_value)
+            if is_percent:
+                numeric_value /= 100
+            result[key] = numeric_value
+        except (ValueError, TypeError):
+            result[key] = np.nan
+    return result
+
+
+def infer_trade_direction(trade_type):
+    """
+    根据交易类型推断方向(多/空)
+    """
+    if not isinstance(trade_type, str):
+        return 'unknown'
+    lowered = trade_type.lower()
+    if '多' in trade_type or 'long' in lowered:
+        return 'long'
+    if '空' in trade_type or 'short' in lowered:
+        return 'short'
+    return 'unknown'
+
+
+def evaluate_trend_alignment(row):
+    """
+    基于MA60、成交价、open与交易方向判断趋势是否一致
+    """
+    direction = infer_trade_direction(row.get('交易类型'))
+    ma60 = row.get('MA60')
+    trade_price = row.get('成交价')
+    open_price = row.get('open')
+    if direction == 'unknown' or pd.isna(ma60) or pd.isna(trade_price) or pd.isna(open_price):
+        return '数据不足'
+    if direction == 'long':
+        return '一致' if trade_price >= ma60 and open_price >= ma60 else '不一致'
+    if direction == 'short':
+        return '一致' if trade_price <= ma60 and open_price <= ma60 else '不一致'
+    return '数据不足'
+
+
+def calculate_ma_compaction(row, ma_columns):
+    """
+    计算多条均线的聚合度:标准差 / 均值
+    """
+    values = [row.get(col) for col in ma_columns]
+    if any(pd.isna(v) for v in values):
+        return np.nan
+    mean_val = np.mean(values)
+    if mean_val == 0:
+        return np.nan
+    std_val = np.std(values, ddof=0)
+    return std_val / mean_val
+
+
+def calculate_gap_ratio(prev_close, open_price):
+    """
+    计算跳空比例
+    """
+    if pd.isna(prev_close) or prev_close == 0 or pd.isna(open_price):
+        return np.nan
+    return abs(open_price - prev_close) / abs(prev_close)
+
+
+def calculate_relative_gap(prev_close, open_price, avg_5day_change):
+    """
+    计算相对跳空比例:跳空比率 / avg_5day_change
+    """
+    base_gap = calculate_gap_ratio(prev_close, open_price)
+    if pd.isna(base_gap) or pd.isna(avg_5day_change) or avg_5day_change == 0:
+        return np.nan
+    return base_gap / abs(avg_5day_change)
+
+
+def assign_quantile_labels(df, source_col, target_col, q=4):
+    """
+    根据分位数为连续指标打标签
+    """
+    labels = [f'Q{i+1}' for i in range(q)]
+    valid = df[source_col].dropna()
+    unique_count = valid.nunique()
+    if valid.empty or unique_count < 2:
+        df[target_col] = np.nan
+        return
+    bins = min(q, unique_count)
+    quantiles = pd.qcut(valid, q=bins, labels=labels[:bins], duplicates='drop')
+    df[target_col] = np.nan
+    df.loc[quantiles.index, target_col] = quantiles.astype(str)
+
+
+def calculate_time_segment(order_time_str, trading_start_time_str):
+    """
+    计算开仓时间相对于交易开始时间的时间段
+    返回: '<30分钟', '30-60分钟', '>1小时'
+    """
+    try:
+        # 解析时间字符串
+        order_time = datetime.strptime(order_time_str, '%H:%M:%S')
+        start_time = datetime.strptime(trading_start_time_str, '%H:%M')
+        
+        # 如果是夜盘品种(21:00开盘),需要特殊处理跨日情况
+        if trading_start_time_str == '21:00':
+            # 如果委托时间在21:00之前,说明是第二天的交易时间
+            if order_time.hour < 21 and order_time.hour >= 0:
+                # 加24小时处理跨日
+                order_time = order_time + timedelta(days=1)
+                
+        # 计算时间差(分钟)
+        time_diff = (order_time - start_time).total_seconds() / 60
+        
+        # 处理负数情况(可能是跨日)
+        if time_diff < 0:
+            time_diff += 24 * 60
+            
+        # 分类
+        if time_diff < 30:
+            return '<30分钟'
+        elif time_diff < 60:
+            return '30-60分钟'
+        else:
+            return '>1小时'
+    except Exception as e:
+        print(f"时间计算错误: {order_time_str}, {trading_start_time_str}, {e}")
+        return '未知'
+
+
+def calculate_session_type(order_time_str, has_night_session):
+    """
+    计算交易时段类型:夜盘、上午、下午
+    """
+    try:
+        order_time = datetime.strptime(order_time_str, '%H:%M:%S')
+        hour = order_time.hour
+        
+        if has_night_session and (hour >= 21 or hour < 3):
+            return '夜盘'
+        elif 9 <= hour < 12:
+            return '上午'
+        elif 12 <= hour < 16:
+            return '下午'
+        else:
+            return '其他'
+    except:
+        return '未知'
+
+
+def load_and_preprocess_data(csv_path):
+    """
+    加载并预处理数据
+    """
+    print("正在加载数据...")
+    df = pd.read_csv(csv_path)
+    
+    print(f"原始数据行数: {len(df)}")
+    print(f"数据列: {df.columns.tolist()}")
+
+    # 解析details列,补充所需字段
+    if 'details' in df.columns:
+        details_df = df['details'].apply(parse_details).apply(pd.Series)
+        for key in DETAIL_KEYS:
+            if key in details_df.columns:
+                df[key] = details_df[key]
+            elif key not in df.columns:
+                df[key] = np.nan
+    else:
+        for key in DETAIL_KEYS:
+            if key not in df.columns:
+                df[key] = np.nan
+
+    if '成交价' not in df.columns:
+        df['成交价'] = np.nan
+
+    avg5_col = 'avg_5day_change'
+    if avg5_col not in df.columns:
+        if 'AVG5' in df.columns:
+            df[avg5_col] = df['AVG5']
+        else:
+            df[avg5_col] = np.nan
+    else:
+        if 'AVG5' in df.columns:
+            df[avg5_col] = df[avg5_col].fillna(df['AVG5'])
+    
+    # 提取品种代码
+    df['品种代码'] = df['标的'].apply(extract_symbol_code)
+    
+    # 获取品种配置信息
+    df['trading_start_time'] = df['品种代码'].apply(
+        lambda x: FUTURES_CONFIG.get(x, {}).get('trading_start_time', None)
+    )
+    df['has_night_session'] = df['品种代码'].apply(
+        lambda x: FUTURES_CONFIG.get(x, {}).get('has_night_session', False)
+    )
+    
+    # 计算开盘后时间段
+    df['开盘后时间段'] = df.apply(
+        lambda row: calculate_time_segment(row['委托时间'], row['trading_start_time']) 
+        if pd.notna(row['trading_start_time']) else '未知',
+        axis=1
+    )
+    
+    # 计算交易时段
+    df['交易时段'] = df.apply(
+        lambda row: calculate_session_type(row['委托时间'], row['has_night_session']),
+        axis=1
+    )
+    
+    # 计算保证金收益率
+    df['保证金收益率'] = (df['交易盈亏'] / df['保证金']) * 100
+    
+    # 计算穿越均线数量
+    df['穿越均线数量'] = df['crossed_ma_lines'].apply(
+        lambda x: len(x.split(';')) if pd.notna(x) else 0
+    )
+    
+    # 判断是否盈利
+    df['是否盈利'] = df['交易盈亏'] > 0
+    
+    # 成交额分组
+    df['成交额分组'] = pd.cut(df['成交额'], 
+                              bins=[0, 100000, 200000, 500000, float('inf')],
+                              labels=['<10万', '10-20万', '20-50万', '>50万'])
+
+    # 趋势一致性与衍生指标
+    df['趋势一致'] = df.apply(evaluate_trend_alignment, axis=1)
+    df['均线聚合度_5_10_20_30'] = df.apply(
+        lambda row: calculate_ma_compaction(row, ['MA5', 'MA10', 'MA20', 'MA30']), axis=1
+    )
+    df['均线聚合度_5_10_20'] = df.apply(
+        lambda row: calculate_ma_compaction(row, ['MA5', 'MA10', 'MA20']), axis=1
+    )
+    assign_quantile_labels(df, '均线聚合度_5_10_20_30', '均线聚合度_5_10_20_30_分位')
+    assign_quantile_labels(df, '均线聚合度_5_10_20', '均线聚合度_5_10_20_分位')
+
+    df['跳空比率'] = df.apply(
+        lambda row: calculate_gap_ratio(row.get('prev_close'), row.get('open')), axis=1
+    )
+    df['跳空相对波动'] = df.apply(
+        lambda row: calculate_relative_gap(row.get('prev_close'), row.get('open'), row.get('avg_5day_change')),
+        axis=1
+    )
+    assign_quantile_labels(df, '跳空相对波动', '跳空相对波动分位')
+    
+    print(f"预处理后数据行数: {len(df)}")
+    print(f"品种代码提取成功率: {df['品种代码'].notna().sum() / len(df) * 100:.2f}%")
+    
+    return df
+
+
+def calculate_statistics(group_df):
+    """
+    计算统计指标
+    """
+    total_count = len(group_df)
+    win_count = (group_df['交易盈亏'] > 0).sum()
+    win_rate = win_count / total_count if total_count > 0 else 0
+    
+    avg_profit_loss = group_df['交易盈亏'].mean()
+    
+    # 计算盈亏比
+    profit_trades = group_df[group_df['交易盈亏'] > 0]['交易盈亏']
+    loss_trades = group_df[group_df['交易盈亏'] <= 0]['交易盈亏']
+    
+    avg_profit = profit_trades.mean() if len(profit_trades) > 0 else 0
+    avg_loss = abs(loss_trades.mean()) if len(loss_trades) > 0 else 0
+    profit_loss_ratio = avg_profit / avg_loss if avg_loss > 0 else np.inf
+    
+    avg_margin_return = group_df['保证金收益率'].mean()
+    
+    return pd.Series({
+        '出现次数': total_count,
+        '胜率': win_rate,
+        '平均盈亏': avg_profit_loss,
+        '盈亏比': profit_loss_ratio,
+        '平均保证金收益率': avg_margin_return
+    })
+
+
+def analyze_ma_lines(df):
+    """
+    分析crossed_ma_lines维度
+    """
+    print("\n" + "="*80)
+    print("均线组合分析")
+    print("="*80)
+    
+    ma_stats = df.groupby('crossed_ma_lines').apply(calculate_statistics).round(4)
+    ma_stats = ma_stats.sort_values('出现次数', ascending=False)
+    
+    print(ma_stats.to_string())
+    
+    return ma_stats
+
+
+def analyze_time_segment(df):
+    """
+    分析开盘后时间段维度
+    """
+    print("\n" + "="*80)
+    print("开盘后时间段分析")
+    print("="*80)
+    
+    time_stats = df.groupby('开盘后时间段').apply(calculate_statistics).round(4)
+    
+    # 按指定顺序排列
+    order = ['<30分钟', '30-60分钟', '>1小时', '未知']
+    time_stats = time_stats.reindex([o for o in order if o in time_stats.index])
+    
+    print(time_stats.to_string())
+    
+    return time_stats
+
+
+def analyze_cross_dimension(df):
+    """
+    交叉分析:均线组合 × 开盘后时间段
+    """
+    print("\n" + "="*80)
+    print("交叉分析:均线组合 × 开盘后时间段")
+    print("="*80)
+    
+    # 样本量分布
+    cross_count = pd.crosstab(df['crossed_ma_lines'], df['开盘后时间段'])
+    print("\n样本量分布:")
+    print(cross_count.to_string())
+    
+    # 胜率对比
+    cross_winrate = pd.crosstab(
+        df['crossed_ma_lines'], 
+        df['开盘后时间段'], 
+        values=df['是否盈利'], 
+        aggfunc='mean'
+    ).round(4)
+    print("\n胜率对比:")
+    print(cross_winrate.to_string())
+    
+    # 平均盈亏
+    cross_profit = pd.crosstab(
+        df['crossed_ma_lines'],
+        df['开盘后时间段'],
+        values=df['交易盈亏'],
+        aggfunc='mean'
+    ).round(2)
+    print("\n平均盈亏:")
+    print(cross_profit.to_string())
+    
+    # 平均保证金收益率
+    cross_return = pd.crosstab(
+        df['crossed_ma_lines'], 
+        df['开盘后时间段'], 
+        values=df['保证金收益率'], 
+        aggfunc='mean'
+    ).round(4)
+    print("\n平均保证金收益率(%):")
+    print(cross_return.to_string())
+    
+    return cross_count, cross_winrate, cross_profit, cross_return
+
+
+def analyze_trade_type_and_variety(df):
+    """
+    分析交易类型和品种维度
+    """
+    print("\n" + "="*80)
+    print("交易类型分析")
+    print("="*80)
+    
+    trade_type_stats = df.groupby('交易类型').apply(calculate_statistics).round(4)
+    print(trade_type_stats.to_string())
+    
+    print("\n" + "="*80)
+    print("品种类型分析")
+    print("="*80)
+    
+    variety_stats = df.groupby('品种').apply(calculate_statistics).round(4)
+    print(variety_stats.to_string())
+    
+    print("\n" + "="*80)
+    print("具体品种代码分析(前20名)")
+    print("="*80)
+    
+    symbol_stats = df.groupby('品种代码').apply(calculate_statistics).round(4)
+    symbol_stats = symbol_stats.sort_values('出现次数', ascending=False).head(20)
+    print(symbol_stats.to_string())
+    
+    return trade_type_stats, variety_stats, symbol_stats
+
+
+def analyze_additional_dimensions(df):
+    """
+    其他维度分析
+    """
+    print("\n" + "="*80)
+    print("成交额分组分析")
+    print("="*80)
+    
+    amount_stats = df.groupby('成交额分组').apply(calculate_statistics).round(4)
+    print(amount_stats.to_string())
+    
+    print("\n" + "="*80)
+    print("交易时段分析")
+    print("="*80)
+    
+    session_stats = df.groupby('交易时段').apply(calculate_statistics).round(4)
+    print(session_stats.to_string())
+    
+    print("\n" + "="*80)
+    print("穿越均线数量分析")
+    print("="*80)
+    
+    ma_count_stats = df.groupby('穿越均线数量').apply(calculate_statistics).round(4)
+    print(ma_count_stats.to_string())
+    
+    print("\n" + "="*80)
+    print("多空对比(按均线组合)- 前10个组合")
+    print("="*80)
+    
+    # 获取出现次数最多的前10个均线组合
+    top_ma_lines = df['crossed_ma_lines'].value_counts().head(10).index
+    df_top = df[df['crossed_ma_lines'].isin(top_ma_lines)]
+    
+    long_short_stats = df_top.groupby(['crossed_ma_lines', '交易类型']).apply(
+        calculate_statistics
+    ).round(4)
+    print(long_short_stats.to_string())
+    
+    print("\n" + "="*80)
+    print("品种特性分析(有夜盘 vs 无夜盘)")
+    print("="*80)
+    
+    night_session_stats = df.groupby('has_night_session').apply(calculate_statistics).round(4)
+    night_session_stats.index = ['无夜盘', '有夜盘']
+    print(night_session_stats.to_string())
+    
+    print("\n" + "="*80)
+    print("组合策略分析:最佳组合(样本量>=10)")
+    print("="*80)
+    
+    # 三维组合分析
+    combo_stats = df.groupby(['crossed_ma_lines', '开盘后时间段', '交易类型']).apply(
+        calculate_statistics
+    ).round(4)
+    
+    # 筛选样本量>=10的组合
+    combo_stats = combo_stats[combo_stats['出现次数'] >= 10]
+    
+    # 按保证金收益率排序,显示前10
+    combo_stats_sorted = combo_stats.sort_values('平均保证金收益率', ascending=False).head(10)
+    print("\n保证金收益率最高的10个组合:")
+    print(combo_stats_sorted.to_string())
+    
+    # 按胜率排序,显示前10
+    combo_stats_sorted_winrate = combo_stats.sort_values('胜率', ascending=False).head(10)
+    print("\n胜率最高的10个组合:")
+    print(combo_stats_sorted_winrate.to_string())
+    
+    return {
+        'amount_stats': amount_stats,
+        'session_stats': session_stats,
+        'ma_count_stats': ma_count_stats,
+        'long_short_stats': long_short_stats,
+        'night_session_stats': night_session_stats,
+        'combo_stats': combo_stats
+    }
+
+
+def analyze_trend_alignment(df):
+    """
+    趋势一致性分析
+    """
+    print("\n" + "="*80)
+    print("趋势一致性分析(基于MA60 vs 成交价/open)")
+    print("="*80)
+    
+    if df['趋势一致'].dropna().empty:
+        print("暂无可用数据")
+        return None
+    
+    trend_stats = df.groupby('趋势一致').apply(calculate_statistics).round(4)
+    print(trend_stats.to_string())
+    
+    return trend_stats
+
+
+def analyze_ma_compaction(df):
+    """
+    均线聚合度分析
+    """
+    print("\n" + "="*80)
+    print("均线聚合度分析(标准差/均值)")
+    print("="*80)
+    
+    compaction_results = {}
+    config = [
+        ('均线聚合度_5_10_20_30', 'MA5/MA10/MA20/MA30', '均线聚合度_5_10_20_30_分位', 'ma_compaction_ma5_ma30'),
+        ('均线聚合度_5_10_20', 'MA5/MA10/MA20', '均线聚合度_5_10_20_分位', 'ma_compaction_ma5_ma20')
+    ]
+    
+    for col, label, quantile_col, result_key in config:
+        print(f"\n--- {label} ---")
+        if col not in df.columns or df[col].dropna().empty:
+            print("数据不足,无法分析。")
+            compaction_results[result_key] = None
+            continue
+        
+        print(f"{label} 描述统计:均值={df[col].mean():.4f}, 中位数={df[col].median():.4f}, 最大值={df[col].max():.4f}")
+        if quantile_col not in df.columns or df[quantile_col].dropna().empty:
+            print("分位标签缺失,跳过统计。")
+            compaction_results[result_key] = None
+            continue
+        
+        stats = df.groupby(quantile_col).apply(calculate_statistics).round(4)
+        print(stats.to_string())
+        compaction_results[result_key] = stats
+    
+    return compaction_results
+
+
+def analyze_gap_behavior(df):
+    """
+    跳空行为分析
+    """
+    print("\n" + "="*80)
+    print("跳空行为分析")
+    print("="*80)
+    
+    if '跳空比率' not in df.columns or df['跳空比率'].dropna().empty:
+        print("缺少跳空数据,无法分析。")
+        return None
+    
+    print(f"跳空比率描述:均值={df['跳空比率'].mean():.4f}, 最大值={df['跳空比率'].max():.4f}")
+    if '跳空相对波动' in df.columns and not df['跳空相对波动'].dropna().empty:
+        print(f"跳空相对波动描述:均值={df['跳空相对波动'].mean():.4f}, 最大值={df['跳空相对波动'].max():.4f}")
+    
+    if '跳空相对波动分位' not in df.columns or df['跳空相对波动分位'].dropna().empty:
+        print("跳空相对波动分位标签缺失,跳过分组统计。")
+        return None
+    
+    gap_stats = df.groupby('跳空相对波动分位').apply(calculate_statistics).round(4)
+    print("\n按跳空相对波动分位的表现:")
+    print(gap_stats.to_string())
+    
+    return gap_stats
+
+
+def analyze_enhanced_cross_metrics(df):
+    """
+    将新增指标与核心维度(均线组合、开盘后时间段)交叉对比
+    """
+    print("\n" + "="*80)
+    print("扩展指标交叉分析(趋势一致/均线聚合度/跳空 vs 核心维度)")
+    print("="*80)
+
+    config = [
+        (['crossed_ma_lines', '趋势一致'], '趋势一致 × 均线组合', 'trend_vs_ma'),
+        (['开盘后时间段', '趋势一致'], '趋势一致 × 开盘后时间段', 'trend_vs_time'),
+        (['crossed_ma_lines', '均线聚合度_5_10_20_30_分位'], '均线聚合度(4) × 均线组合', 'ma_compact4_vs_ma'),
+        (['开盘后时间段', '均线聚合度_5_10_20_30_分位'], '均线聚合度(4) × 开盘后时间段', 'ma_compact4_vs_time'),
+        (['crossed_ma_lines', '均线聚合度_5_10_20_分位'], '均线聚合度(3) × 均线组合', 'ma_compact3_vs_ma'),
+        (['开盘后时间段', '均线聚合度_5_10_20_分位'], '均线聚合度(3) × 开盘后时间段', 'ma_compact3_vs_time'),
+        (['crossed_ma_lines', '跳空相对波动分位'], '跳空相对波动 × 均线组合', 'gap_vs_ma'),
+        (['开盘后时间段', '跳空相对波动分位'], '跳空相对波动 × 开盘后时间段', 'gap_vs_time'),
+    ]
+
+    results = {}
+    for group_cols, title, key in config:
+        missing_cols = [col for col in group_cols if col not in df.columns]
+        if missing_cols:
+            print(f"\n{title}: 缺少列 {missing_cols},跳过。")
+            results[key] = None
+            continue
+        if df[group_cols[1]].dropna().empty:
+            print(f"\n{title}: 数据不足,跳过。")
+            results[key] = None
+            continue
+        stats = df.groupby(group_cols).apply(calculate_statistics).round(4)
+        print(f"\n{title}")
+        print(stats.to_string())
+        results[key] = stats
+
+    return results
+
+
+def create_visualizations(df, ma_stats, time_stats, cross_winrate, cross_profit, cross_return, output_dir):
+    """
+    创建数据可视化图表
+    """
+    print("\n" + "="*80)
+    print("生成可视化图表...")
+    print("="*80)
+    
+    # 创建输出目录
+    os.makedirs(output_dir, exist_ok=True)
+    
+    def annotate_barh(ax, bars, formatter=lambda v: f"{v:.0f}", offset_ratio=0.01):
+        """
+        为水平柱状图添加数值标注
+        """
+        if bars is None or len(bars) == 0:
+            return
+        max_width = max((abs(bar.get_width()) for bar in bars), default=0)
+        # 偏移量按柱长比例计算,避免胜率等比例型数据的标注被推到坐标轴之外
+        offset = max_width * offset_ratio if max_width > 0 else 0.01
+        for bar in bars:
+            width = bar.get_width()
+            if np.isnan(width):
+                continue
+            ha = 'left'
+            x = width + offset
+            if width < 0:
+                ha = 'right'
+                x = width - offset
+            y = bar.get_y() + bar.get_height() / 2
+            ax.text(x, y, formatter(width), va='center', ha=ha, fontsize=9)
+    
+    def annotate_bar(ax, bars, formatter=lambda v: f"{v:.0f}", offset_ratio=0.01):
+        """
+        为垂直柱状图添加数值标注
+        """
+        if bars is None or len(bars) == 0:
+            return
+        max_height = max((abs(bar.get_height()) for bar in bars), default=0)
+        # 偏移量按柱高比例计算,避免比例型数据的标注被推到坐标轴之外
+        offset = max_height * offset_ratio if max_height > 0 else 0.01
+        for bar in bars:
+            height = bar.get_height()
+            if np.isnan(height):
+                continue
+            va = 'bottom'
+            y = height + offset
+            if height < 0:
+                va = 'top'
+                y = height - offset
+            x = bar.get_x() + bar.get_width() / 2
+            ax.text(x, y, formatter(height), va=va, ha='center', fontsize=9)
+    
+    # 1. 均线组合表现对比(前15个)
+    fig, axes = plt.subplots(2, 2, figsize=(16, 12))
+    
+    top_ma = ma_stats.head(15)
+    
+    # 出现次数
+    bars = axes[0, 0].barh(range(len(top_ma)), top_ma['出现次数'])
+    axes[0, 0].set_yticks(range(len(top_ma)))
+    axes[0, 0].set_yticklabels(top_ma.index)
+    axes[0, 0].set_xlabel('出现次数')
+    axes[0, 0].set_title('均线组合出现次数(Top 15)')
+    axes[0, 0].invert_yaxis()
+    annotate_barh(axes[0, 0], bars)
+    
+    # 胜率
+    colors = ['green' if x > 0.5 else 'red' for x in top_ma['胜率']]
+    bars = axes[0, 1].barh(range(len(top_ma)), top_ma['胜率'], color=colors)
+    axes[0, 1].set_yticks(range(len(top_ma)))
+    axes[0, 1].set_yticklabels(top_ma.index)
+    axes[0, 1].set_xlabel('胜率')
+    axes[0, 1].set_title('均线组合胜率(Top 15)')
+    axes[0, 1].axvline(x=0.5, color='black', linestyle='--', alpha=0.5)
+    axes[0, 1].invert_yaxis()
+    annotate_barh(axes[0, 1], bars, formatter=lambda v: f"{v:.1%}", offset_ratio=0.02)
+    
+    # 平均盈亏
+    colors = ['green' if x > 0 else 'red' for x in top_ma['平均盈亏']]
+    bars = axes[1, 0].barh(range(len(top_ma)), top_ma['平均盈亏'], color=colors)
+    axes[1, 0].set_yticks(range(len(top_ma)))
+    axes[1, 0].set_yticklabels(top_ma.index)
+    axes[1, 0].set_xlabel('平均盈亏(元)')
+    axes[1, 0].set_title('均线组合平均盈亏(Top 15)')
+    axes[1, 0].axvline(x=0, color='black', linestyle='--', alpha=0.5)
+    axes[1, 0].invert_yaxis()
+    annotate_barh(axes[1, 0], bars, formatter=lambda v: f"{v:,.0f}", offset_ratio=0.015)
+    
+    # 保证金收益率
+    colors = ['green' if x > 0 else 'red' for x in top_ma['平均保证金收益率']]
+    bars = axes[1, 1].barh(range(len(top_ma)), top_ma['平均保证金收益率'], color=colors)
+    axes[1, 1].set_yticks(range(len(top_ma)))
+    axes[1, 1].set_yticklabels(top_ma.index)
+    axes[1, 1].set_xlabel('平均保证金收益率(%)')
+    axes[1, 1].set_title('均线组合平均保证金收益率(Top 15)')
+    axes[1, 1].axvline(x=0, color='black', linestyle='--', alpha=0.5)
+    axes[1, 1].invert_yaxis()
+    annotate_barh(axes[1, 1], bars, formatter=lambda v: f"{v:.2f}%", offset_ratio=0.02)
+    
+    plt.tight_layout()
+    plt.savefig(os.path.join(output_dir, 'ma_lines_analysis.png'), dpi=150, bbox_inches='tight')
+    print(f"已保存: {os.path.join(output_dir, 'ma_lines_analysis.png')}")
+    plt.close()
+    
+    # 2. 开盘后时间段表现
+    fig, axes = plt.subplots(1, 2, figsize=(14, 6))
+    order = ['<30分钟', '30-60分钟', '>1小时']
+    time_stats_filtered = time_stats.loc[[idx for idx in order if idx in time_stats.index]]
+    
+    profit_colors = ['green' if val >= 0 else 'red' for val in time_stats_filtered['平均盈亏']]
+    bars = axes[0].bar(range(len(time_stats_filtered)), time_stats_filtered['平均盈亏'], color=profit_colors)
+    axes[0].set_xticks(range(len(time_stats_filtered)))
+    axes[0].set_xticklabels(time_stats_filtered.index)
+    axes[0].set_ylabel('平均盈亏(元)')
+    axes[0].set_title('不同时间段平均盈亏')
+    axes[0].axhline(y=0, color='black', linestyle='--', alpha=0.5)
+    annotate_bar(axes[0], bars, formatter=lambda v: f"{v:,.0f}")
+    
+    margin_colors = ['green' if val >= 0 else 'red' for val in time_stats_filtered['平均保证金收益率']]
+    bars = axes[1].bar(range(len(time_stats_filtered)), time_stats_filtered['平均保证金收益率'], color=margin_colors)
+    axes[1].set_xticks(range(len(time_stats_filtered)))
+    axes[1].set_xticklabels(time_stats_filtered.index)
+    axes[1].set_ylabel('平均保证金收益率(%)')
+    axes[1].set_title('不同时间段平均保证金收益率')
+    axes[1].axhline(y=0, color='black', linestyle='--', alpha=0.5)
+    annotate_bar(axes[1], bars, formatter=lambda v: f"{v:.2f}%")
+    
+    plt.tight_layout()
+    plt.savefig(os.path.join(output_dir, 'time_segment_analysis.png'), dpi=150, bbox_inches='tight')
+    print(f"已保存: {os.path.join(output_dir, 'time_segment_analysis.png')}")
+    plt.close()
+    
+    # 3. 交叉分析热力图
+    fig, axes = plt.subplots(1, 3, figsize=(22, 10))
+    
+    # 选择前15个均线组合
+    top_ma_lines = ma_stats.head(15).index
+    heatmap_cols = ['<30分钟', '30-60分钟', '>1小时']
+    
+    def prepare_heatmap(table):
+        filtered = table.reindex(index=[idx for idx in top_ma_lines if idx in table.index])
+        if filtered.empty:
+            return filtered
+        cols = [col for col in heatmap_cols if col in filtered.columns]
+        if cols:
+            filtered = filtered[cols]
+        return filtered
+    
+    cross_winrate_filtered = prepare_heatmap(cross_winrate)
+    cross_profit_filtered = prepare_heatmap(cross_profit)
+    cross_return_filtered = prepare_heatmap(cross_return)
+    
+    # 胜率热力图
+    sns.heatmap(cross_winrate_filtered, annot=True, fmt='.2f', cmap='RdYlGn', 
+                center=0.5, vmin=0, vmax=1, ax=axes[0], cbar_kws={'label': '胜率'})
+    axes[0].set_title('均线组合 × 时间段 胜率热力图(Top 15)')
+    axes[0].set_xlabel('开盘后时间段')
+    axes[0].set_ylabel('均线组合')
+    
+    # 平均盈亏热力图
+    sns.heatmap(cross_profit_filtered, annot=True, fmt='.0f', cmap='RdYlGn', center=0,
+                ax=axes[1], cbar_kws={'label': '平均盈亏(元)'})
+    axes[1].set_title('均线组合 × 时间段 平均盈亏热力图(Top 15)')
+    axes[1].set_xlabel('开盘后时间段')
+    axes[1].set_ylabel('均线组合')
+    
+    # 平均保证金收益率热力图
+    sns.heatmap(cross_return_filtered, annot=True, fmt='.2f', cmap='RdYlGn', center=0,
+                ax=axes[2], cbar_kws={'label': '平均保证金收益率(%)'})
+    axes[2].set_title('均线组合 × 时间段 平均保证金收益率热力图(Top 15)')
+    axes[2].set_xlabel('开盘后时间段')
+    axes[2].set_ylabel('均线组合')
+    
+    plt.tight_layout()
+    plt.savefig(os.path.join(output_dir, 'cross_analysis_heatmap.png'), dpi=150, bbox_inches='tight')
+    print(f"已保存: {os.path.join(output_dir, 'cross_analysis_heatmap.png')}")
+    plt.close()
+    
+    # 4. 品种表现分析
+    fig, axes = plt.subplots(2, 2, figsize=(16, 12))
+    
+    # 交易类型对比
+    trade_type_stats = df.groupby('交易类型').apply(calculate_statistics)
+    axes[0, 0].bar(trade_type_stats.index, trade_type_stats['胜率'], 
+                   color=['green', 'red'])
+    axes[0, 0].set_ylabel('胜率')
+    axes[0, 0].set_title('交易类型胜率对比')
+    axes[0, 0].axhline(y=0.5, color='black', linestyle='--', alpha=0.5)
+    
+    axes[0, 1].bar(trade_type_stats.index, trade_type_stats['平均保证金收益率'],
+                   color=['green', 'red'])
+    axes[0, 1].set_ylabel('平均保证金收益率(%)')
+    axes[0, 1].set_title('交易类型保证金收益率对比')
+    axes[0, 1].axhline(y=0, color='black', linestyle='--', alpha=0.5)
+    
+    # 品种类型对比
+    variety_stats = df.groupby('品种').apply(calculate_statistics)
+    axes[1, 0].bar(variety_stats.index, variety_stats['胜率'])
+    axes[1, 0].set_ylabel('胜率')
+    axes[1, 0].set_title('品种类型胜率对比')
+    axes[1, 0].axhline(y=0.5, color='black', linestyle='--', alpha=0.5)
+    
+    axes[1, 1].bar(variety_stats.index, variety_stats['平均保证金收益率'])
+    axes[1, 1].set_ylabel('平均保证金收益率(%)')
+    axes[1, 1].set_title('品种类型保证金收益率对比')
+    axes[1, 1].axhline(y=0, color='black', linestyle='--', alpha=0.5)
+    
+    plt.tight_layout()
+    plt.savefig(os.path.join(output_dir, 'variety_analysis.png'), dpi=150, bbox_inches='tight')
+    print(f"已保存: {os.path.join(output_dir, 'variety_analysis.png')}")
+    plt.close()
+
+    # 5. 扩展指标热力图
+    def build_metric_pivot(source_df, row_field, col_field, value_field, row_order=None, col_order=None):
+        filtered = source_df.dropna(subset=[col_field])
+        if row_order is not None:
+            filtered = filtered[filtered[row_field].isin(row_order)]
+        pivot = pd.pivot_table(
+            filtered,
+            index=row_field,
+            columns=col_field,
+            values=value_field,
+            aggfunc='mean'
+        )
+        if row_order is not None:
+            pivot = pivot.reindex([idx for idx in row_order if idx in pivot.index])
+        if col_order is not None:
+            pivot = pivot[[col for col in col_order if col in pivot.columns]]
+        return pivot
+
+    def plot_heatmap(ax, data, title, fmt='.2f', center=None, vmin=None, vmax=None, cbar_label=''):
+        if data.empty:
+            ax.axis('off')
+            ax.set_title(f"{title}(数据不足)")
+            return
+        sns.heatmap(
+            data,
+            annot=True,
+            fmt=fmt,
+            cmap='RdYlGn',
+            center=center,
+            vmin=vmin,
+            vmax=vmax,
+            ax=ax,
+            cbar_kws={'label': cbar_label}
+        )
+        ax.set_title(title)
+        ax.set_xlabel(data.columns.name or '')
+        ax.set_ylabel(data.index.name or '')
+
+    enhanced_metric_configs = [
+        ('趋势一致', 'trend_alignment_cross.png', '趋势一致', ['一致', '不一致', '数据不足']),
+        ('均线聚合度_5_10_20_30_分位', 'ma_compaction_4lines_cross.png', '均线聚合度(MA5/MA10/MA20/MA30)', None),
+        ('均线聚合度_5_10_20_分位', 'ma_compaction_3lines_cross.png', '均线聚合度(MA5/MA10/MA20)', None),
+        ('跳空相对波动分位', 'gap_behavior_cross.png', '跳空相对波动', None),
+    ]
+
+    row_configs = [
+        ('crossed_ma_lines', 'Top 15 均线组合', list(ma_stats.head(15).index)),
+        ('开盘后时间段', '开盘后时间段', ['<30分钟', '30-60分钟', '>1小时', '未知']),
+    ]
+
+    value_configs = [
+        ('是否盈利', '胜率', '.2f', 0.5, 0, 1, '胜率'),
+        ('交易盈亏', '平均盈亏', '.0f', 0, None, None, '平均盈亏(元)'),
+        ('保证金收益率', '平均保证金收益率', '.2f', 0, None, None, '平均保证金收益率(%)'),
+    ]
+
+    for metric_field, filename, metric_title, col_order in enhanced_metric_configs:
+        if metric_field not in df.columns or df[metric_field].dropna().empty:
+            continue
+        fig, axes = plt.subplots(len(row_configs), len(value_configs), figsize=(22, 12))
+        for row_idx, (row_field, row_label, row_order) in enumerate(row_configs):
+            for col_idx, (value_field, value_label, fmt, center, vmin, vmax, cbar_label) in enumerate(value_configs):
+                ax = axes[row_idx, col_idx]
+                pivot = build_metric_pivot(df, row_field, metric_field, value_field, row_order=row_order, col_order=col_order)
+                plot_heatmap(
+                    ax,
+                    pivot,
+                    f"{row_label} - {value_label}",
+                    fmt=fmt,
+                    center=center,
+                    vmin=vmin,
+                    vmax=vmax,
+                    cbar_label=cbar_label
+                )
+                ax.set_xlabel(metric_title)
+                ax.set_ylabel(row_label)
+        plt.suptitle(f"{metric_title} × 核心维度表现", fontsize=16)
+        plt.tight_layout(rect=[0, 0, 1, 0.97])
+        output_path = os.path.join(output_dir, filename)
+        plt.savefig(output_path, dpi=150, bbox_inches='tight')
+        print(f"已保存: {output_path}")
+        plt.close()
+    
+    print("\n所有图表已生成!")
+
+
+def save_results_to_csv(df, ma_stats, time_stats, output_dir,
+                        trend_alignment_stats=None, ma_compaction_stats=None,
+                        gap_stats=None, enhanced_cross_stats=None):
+    """
+    保存分析结果到CSV
+    """
+    print("\n" + "="*80)
+    print("保存分析结果到CSV...")
+    print("="*80)
+    
+    # 保存增强后的原始数据
+    output_file = os.path.join(output_dir, 'records_enhanced.csv')
+    df.to_csv(output_file, index=False, encoding='utf-8-sig')
+    print(f"已保存增强数据: {output_file}")
+    
+    # 保存均线组合统计
+    output_file = os.path.join(output_dir, 'ma_lines_stats.csv')
+    ma_stats.to_csv(output_file, encoding='utf-8-sig')
+    print(f"已保存均线组合统计: {output_file}")
+    
+    # 保存时间段统计
+    output_file = os.path.join(output_dir, 'time_segment_stats.csv')
+    time_stats.to_csv(output_file, encoding='utf-8-sig')
+    print(f"已保存时间段统计: {output_file}")
+    
+    # 保存品种统计
+    symbol_stats = df.groupby('品种代码').apply(calculate_statistics)
+    output_file = os.path.join(output_dir, 'symbol_stats.csv')
+    symbol_stats.to_csv(output_file, encoding='utf-8-sig')
+    print(f"已保存品种统计: {output_file}")
+    
+    # 保存组合策略统计
+    combo_stats = df.groupby(['crossed_ma_lines', '开盘后时间段', '交易类型']).apply(
+        calculate_statistics
+    )
+    combo_stats = combo_stats[combo_stats['出现次数'] >= 5]
+    combo_stats = combo_stats.sort_values('平均保证金收益率', ascending=False)
+    output_file = os.path.join(output_dir, 'combo_strategy_stats.csv')
+    combo_stats.to_csv(output_file, encoding='utf-8-sig')
+    print(f"已保存组合策略统计: {output_file}")
+
+    if trend_alignment_stats is not None:
+        output_file = os.path.join(output_dir, 'trend_alignment_stats.csv')
+        trend_alignment_stats.to_csv(output_file, encoding='utf-8-sig')
+        print(f"已保存趋势一致性统计: {output_file}")
+
+    if ma_compaction_stats:
+        for key, stats in ma_compaction_stats.items():
+            if stats is None:
+                continue
+            output_file = os.path.join(output_dir, f'{key}_stats.csv')
+            stats.to_csv(output_file, encoding='utf-8-sig')
+            print(f"已保存均线聚合度统计: {output_file}")
+
+    if gap_stats is not None:
+        output_file = os.path.join(output_dir, 'gap_behavior_stats.csv')
+        gap_stats.to_csv(output_file, encoding='utf-8-sig')
+        print(f"已保存跳空行为统计: {output_file}")
+
+    if enhanced_cross_stats:
+        for key, stats in enhanced_cross_stats.items():
+            if stats is None:
+                continue
+            output_file = os.path.join(output_dir, f'{key}_stats.csv')
+            stats.to_csv(output_file, encoding='utf-8-sig')
+            print(f"已保存 {key} 交叉统计: {output_file}")
+
+
+def main():
+    """
+    主函数
+    """
+    # 设置路径
+    script_dir = os.path.dirname(os.path.abspath(__file__))
+    csv_path = os.path.join(script_dir, 'records.csv')
+    output_dir = os.path.join('data', 'future', 'analysis_results')
+    
+    # 检查文件是否存在
+    if not os.path.exists(csv_path):
+        print(f"错误: 找不到文件 {csv_path}")
+        return
+    
+    print("="*80)
+    print("期货开仓记录分析工具")
+    print("="*80)
+    
+    # 加载和预处理数据
+    df = load_and_preprocess_data(csv_path)
+    
+    # 进行各维度分析
+    ma_stats = analyze_ma_lines(df)
+    time_stats = analyze_time_segment(df)
+    cross_count, cross_winrate, cross_profit, cross_return = analyze_cross_dimension(df)
+    trade_type_stats, variety_stats, symbol_stats = analyze_trade_type_and_variety(df)
+    additional_stats = analyze_additional_dimensions(df)
+    trend_alignment_stats = analyze_trend_alignment(df)
+    ma_compaction_stats = analyze_ma_compaction(df)
+    gap_stats = analyze_gap_behavior(df)
+    enhanced_cross_stats = analyze_enhanced_cross_metrics(df)
+    
+    # 生成可视化图表
+    create_visualizations(df, ma_stats, time_stats, cross_winrate, cross_profit, cross_return, output_dir)
+    
+    # 保存结果到CSV
+    save_results_to_csv(
+        df,
+        ma_stats,
+        time_stats,
+        output_dir,
+        trend_alignment_stats=trend_alignment_stats,
+        ma_compaction_stats=ma_compaction_stats,
+        gap_stats=gap_stats,
+        enhanced_cross_stats=enhanced_cross_stats
+    )
+    
+    print("\n" + "="*80)
+    print("分析完成!")
+    print(f"结果保存在: {output_dir}")
+    print("="*80)
+
+
+if __name__ == '__main__':
+    main()
+

+ 122 - 0
Lib/future/tools_documentation.md

@@ -0,0 +1,122 @@
+# 期货交易工具逻辑文档
+
+## 1. 交易训练工具 (trading_training_tool.py)
+
+### 功能概述
+交易训练工具是一个用于训练交易决策能力的交互式工具。它从交易记录中随机选取一笔未处理的开仓交易,先只显示历史K线数据和当天的部分信息(当天收盘价被替换为实际成交价,做多时同时替换最高价、做空时同时替换最低价),让用户判断是否开仓并给出信心指数;随后显示包含未来20天数据的完整K线图,并记录用户的决策结果。
+
+### 核心逻辑流程
+
+#### 1.1 数据准备阶段
+1. **读取交易数据**
+   - 从CSV文件读取所有交易记录
+   - 支持多种编码格式(utf-8, gbk, gb2312等)
+   - 按顺序依次尝试上述编码,直到读取成功(示意见本节末尾)
+
+2. **加载历史处理记录**
+   - 读取已存在的训练结果文件(training_results.csv)
+   - 提取已处理的交易对ID列表,避免重复处理
+
+3. **提取开仓交易**
+   - 遍历所有交易记录,筛选出开仓交易(交易类型以"开"开头)
+   - 提取合约代码、交易日期、成交价、方向等关键信息
+   - 跳过已处理的交易对
+   - 计算对应的平仓盈亏(考虑连续交易对ID的情况)
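+
+下面给出读取逻辑的极简示意(与文末 trading_training_tool.py 中 read_transaction_data 的思路一致,函数名仅为示例):
+
+```python
+import pandas as pd
+
+def read_csv_with_fallback(csv_path):
+    """按顺序尝试多种编码,直到成功读取为止"""
+    for encoding in ['utf-8-sig', 'utf-8', 'gbk', 'gb2312', 'gb18030', 'latin1']:
+        try:
+            return pd.read_csv(csv_path, encoding=encoding)
+        except UnicodeDecodeError:
+            continue  # 该编码失败,换下一种重试
+    return pd.DataFrame()  # 全部失败时返回空表
+```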
+
+#### 1.2 随机选择机制
+- **分组轮询选择**:先按合约类型(合约代码的字母前缀)分组,用random.shuffle打乱组内顺序,再按类型轮询取出,避免同类交易连续出现(示意见下方代码)
+- **避免重复**:通过已处理的交易对ID列表确保不重复选择
+- **显示统计**:显示未处理的开仓交易数量与候选交易数量
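+
+分组轮询的极简示意(与文末 build_trade_queue 的实现思路一致,示例数据为虚构):
+
+```python
+import random
+
+def build_queue(trades_by_type):
+    """每轮从各合约类型中各取一笔,避免同类交易连续出现"""
+    type_order = list(trades_by_type.keys())
+    random.shuffle(type_order)
+    queue = []
+    while any(trades_by_type.values()):
+        for contract_type in type_order:
+            if trades_by_type[contract_type]:
+                queue.append(trades_by_type[contract_type].pop(0))
+    return queue
+
+# 使用示例:先按合约类型分组并打乱组内顺序,再轮询出队
+groups = {'M': ['M2405_交易1', 'M2409_交易2'], 'AG': ['AG2406_交易3']}
+for g in groups.values():
+    random.shuffle(g)
+print(build_queue(groups))
+```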
+
+#### 1.3 K线数据处理
+1. **获取完整数据**
+   - 获取历史100天 + 未来20天的完整K线数据
+   - 包含开高低收价格和5/10/20/30日均线
+
+2. **部分K线图生成**
+   - 截取历史数据+当天数据
+   - 将当天收盘价替换为实际成交价;做多时同时替换最高价,做空时同时替换最低价
+   - 生成只有历史信息的K线图(数据处理示意见本节末尾)
+
+3. **完整K线图生成**
+   - 使用原始数据(不修改当天收盘价)
+   - 显示历史和未来20天的完整走势
+   - 用灰色背景标注未来数据区域
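+
+数据处理的极简示意(假设 data 为含 open/high/low/close 的日线 DataFrame,trade_idx 为成交日位置,函数名仅为示例):
+
+```python
+import pandas as pd
+
+def make_partial_view(data, trade_idx, trade_price, direction):
+    """先在完整数据上计算均线,再截取历史+当天,并用成交价覆盖当天价格"""
+    data = data.copy()
+    for window in (5, 10, 20, 30):
+        data[f'ma{window}'] = data['close'].rolling(window=window).mean()
+    partial = data.iloc[:trade_idx + 1].copy()
+    partial.iloc[-1, partial.columns.get_loc('close')] = trade_price
+    # 做多覆盖最高价、做空覆盖最低价,以标记买卖点
+    price_col = 'high' if direction == 'long' else 'low'
+    partial.iloc[-1, partial.columns.get_loc(price_col)] = trade_price
+    return partial
+```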
+
+#### 1.4 用户交互
+1. **显示交易信息**
+   - 合约代码
+   - 交易日期
+   - 交易方向(多头/空头)
+   - 成交价
+
+2. **获取用户决策**
+   - 提示用户输入'y,信心指数'(开仓)或'n,信心指数'(不开仓),信心指数取1-3,省略时默认为2
+   - 循环验证输入直到获得有效决策(解析示意见本节末尾)
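+
+输入解析的简化示意(不含循环重试,与文末 get_user_decision 的解析规则一致):
+
+```python
+def parse_decision(raw):
+    """解析 'y,3' / 'n' 这类输入,返回 (是否开仓, 信心指数);信心指数缺省为 2"""
+    parts = raw.strip().lower().split(',')
+    decision = parts[0].strip() in ('y', 'yes', '是', '开仓')
+    confidence = 2
+    if len(parts) >= 2 and parts[1].strip() in ('1', '2', '3'):
+        confidence = int(parts[1].strip())
+    return decision, confidence
+
+print(parse_decision('y,3'))   # (True, 3)
+print(parse_decision('n'))     # (False, 2)
+```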
+
+#### 1.5 结果记录
+1. **计算判定收益**
+   - 用户判定开仓:判定收益 = 实际平仓盈亏
+   - 用户判定不开仓:判定收益 = -实际平仓盈亏
+
+2. **记录格式**
+   - 包含原始交易信息(日期、时间、标的、类型、数量、价格)
+   - 平仓盈亏与连续交易总盈亏(连续交易对会合并计算)
+   - 用户判定(开仓/不开仓)及信心指数
+   - 判定收益
+   - 交易对ID和连续交易对ID(写入方式示意见本节末尾)
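+
+判定收益与结果写入的示意(追加写入 CSV,row 为上述记录字段组成的字典,函数名仅为示例):
+
+```python
+import os
+import pandas as pd
+
+def record_decision(result_path, row, opened, actual_pnl):
+    """判定开仓取实际盈亏,判定不开仓取其相反数,然后追加写入结果文件"""
+    row = dict(row)
+    row['用户判定'] = '开仓' if opened else '不开仓'
+    row['判定收益'] = actual_pnl if opened else -actual_pnl
+    header = not os.path.exists(result_path)  # 首次写入时带表头
+    pd.DataFrame([row]).to_csv(result_path, mode='a', header=header,
+                               index=False, encoding='utf-8-sig')
+```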
+
+### 特殊处理逻辑
+
+#### 连续交易对ID处理
+- 如果记录包含有效的连续交易对ID(非'N/A'),则合并所有同一连续交易对ID的平仓盈亏(合并方式示意见下方代码)
+- 这允许处理跨期换月等连续持仓的情况
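+
+合并方式的示意(平仓盈亏列可能带千分位逗号,先清洗再求和;df 为交易记录 DataFrame):
+
+```python
+import pandas as pd
+
+def continuous_total_pnl(df, continuous_pair_id):
+    """汇总同一连续交易对ID下所有平仓记录的平仓盈亏"""
+    mask = (df['连续交易对ID'] == continuous_pair_id) & df['交易类型'].str.startswith('平')
+    pnl = pd.to_numeric(df.loc[mask, '平仓盈亏'].astype(str).str.replace(',', ''),
+                        errors='coerce').fillna(0)
+    return float(pnl.sum())
+```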
+
+#### 日期和时间处理
+- 支持多种日期格式(YYYY-MM-DD, DD/MM/YYYY等)
+- 夜盘交易处理:委托时间>=21:00的交易,实际交易日为下一个交易日(示意见下方代码)
+- 使用JoinQuant的get_trade_days获取准确的交易日历
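+
+夜盘日期处理的示意(依赖 JoinQuant 研究环境提供的 get_trade_days,仅演示思路):
+
+```python
+from jqdata import *  # 提供 get_trade_days,仅在 JoinQuant 环境可用
+
+def actual_trade_date(trade_date, order_time_str):
+    """委托时间 >= 21:00 视为夜盘,实际交易日顺延到下一个交易日"""
+    hour = int(str(order_time_str).split(':')[0])
+    if hour < 21:
+        return trade_date
+    trade_days = get_trade_days(start_date=trade_date, count=2)
+    return trade_days[1] if len(trade_days) >= 2 else trade_date
+```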
+
+#### 合约代码提取
+- 从"标的"列的括号中提取标准合约代码(如:"豆粕2501(DM2501)" -> "DM2501")
+- 自动添加交易所后缀(如.XDCE),提取方式示意见下方代码
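+
+提取方式的示意(交易所后缀在示例中固定为 .XDCE,实际后缀取决于品种所属交易所):
+
+```python
+import re
+
+def extract_contract_code(target, exchange_suffix='.XDCE'):
+    """从"标的"文本的括号中取出合约代码,并拼接交易所后缀(后缀仅为演示假设)"""
+    match = re.search(r'\(([^)]+)\)', str(target))
+    return match.group(1) + exchange_suffix if match else ''
+
+print(extract_contract_code('豆粕2501(DM2501)'))  # DM2501.XDCE
+```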
+
+### 配置参数
+所有参数集中在CONFIG字典中:
+- `csv_filename`: 输入交易记录文件名
+- `result_filename`: 输出结果文件名
+- `history_days`: 历史数据天数(默认100)
+- `future_days`: 未来数据天数(默认20)
+- `output_dir`: 图片输出目录
+- `show_plots`: 是否显示图片
+- `plot_dpi`: 图片分辨率
+- `random_seed`: 随机数种子(确保可重复性)
+
+### 输出文件
+1. **训练结果CSV**(training_results.csv)
+   - 追加模式写入,保留所有历史记录
+   - UTF-8编码,支持中文
+
+2. **K线图片**(training_images/目录)
+   - partial_*.png: 部分K线图
+   - full_*.png: 完整K线图
+
+### 使用场景
+- 交易决策训练:通过历史数据练习判断开仓时机
+- 策略验证:验证人工判断与策略信号的一致性
+- 交易复盘:分析决策质量,改进交易技能
+
+## 2. K线复原工具 (kline_reconstruction.py)
+
+### 功能概述
+(待补充详细逻辑说明)
+
+### 核心功能
+- 从交易记录中提取所有开仓交易
+- 为每个开仓交易生成对应的K线图
+- 包含均线和技术指标
+- 批量生成并打包成ZIP文件
+
+---
+
+*注:本文档主要介绍交易训练工具的详细逻辑,其他工具的逻辑说明将逐步补充。*

+ 937 - 0
Lib/future/trading_training_tool.py

@@ -0,0 +1,937 @@
+# 交易训练工具
+# 用于从交易记录CSV文件中随机选择交易,显示部分K线图让用户判断是否开仓,然后显示完整结果并记录
+
+from jqdata import *
+import pandas as pd
+import numpy as np
+import matplotlib.pyplot as plt
+import matplotlib.patches as patches
+from datetime import datetime, timedelta, date
+import re
+import os
+import random
+import warnings
+warnings.filterwarnings('ignore')
+
+# 中文字体设置
+plt.rcParams['font.sans-serif'] = ['SimHei', 'Microsoft YaHei', 'DejaVu Sans']
+plt.rcParams['axes.unicode_minus'] = False
+
+# ========== 参数配置区域(硬编码参数都在这里) ==========
+CONFIG = {
+    # CSV文件名
+    'csv_filename': 'transaction1.csv',
+
+    # 结果记录文件名
+    'result_filename': 'training_results.csv',
+
+    # 历史数据天数
+    'history_days': 100,
+
+    # 未来数据天数
+    'future_days': 20,
+
+    # 输出目录
+    'output_dir': 'training_images',
+
+    # 是否显示图片(在某些环境下可能需要设置为False)
+    'show_plots': True,
+
+    # 图片DPI
+    'plot_dpi': 150,
+
+    # 随机种子(设置为None表示不固定种子)
+    'random_seed': 42
+}
+# =====================================================
+
+
+def _get_current_directory():
+    """
+    获取当前文件所在目录
+    """
+    try:
+        current_dir = os.path.dirname(os.path.abspath(__file__))
+    except NameError:
+        current_dir = os.getcwd()
+        if not os.path.exists(os.path.join(current_dir, CONFIG['csv_filename'])):
+            parent_dir = os.path.dirname(current_dir)
+            future_dir = os.path.join(parent_dir, 'future')
+            if os.path.exists(os.path.join(future_dir, CONFIG['csv_filename'])):
+                current_dir = future_dir
+    return current_dir
+
+
+def read_transaction_data(csv_path):
+    """
+    读取交易记录CSV文件
+    """
+    encodings = ['utf-8-sig', 'utf-8', 'gbk', 'gb2312', 'gb18030', 'latin1']
+
+    for encoding in encodings:
+        try:
+            df = pd.read_csv(csv_path, encoding=encoding)
+            print(f"成功使用 {encoding} 编码读取CSV文件,共 {len(df)} 条记录")
+            return df
+        except UnicodeDecodeError:
+            continue
+        except Exception as e:
+            if encoding == encodings[-1]:
+                print(f"读取CSV文件时出错: {str(e)}")
+                raise
+
+    print(f"无法使用任何编码格式读取CSV文件: {csv_path}")
+    return pd.DataFrame()
+
+
+def extract_contract_info(row):
+    """
+    从交易记录中提取合约信息和交易信息
+    """
+    try:
+        # 提取合约编号
+        target_str = str(row['标的'])
+        match = re.search(r'\(([^)]+)\)', target_str)
+        if match:
+            contract_code = match.group(1)
+        else:
+            return None, None, None, None, None, None, None, None, None
+
+        # 提取日期
+        date_str = str(row['日期']).strip()
+        date_formats = ['%Y-%m-%d', '%d/%m/%Y', '%Y/%m/%d', '%d-%m-%Y', '%Y%m%d']
+
+        trade_date = None
+        for date_format in date_formats:
+            try:
+                trade_date = datetime.strptime(date_str, date_format).date()
+                break
+            except ValueError:
+                continue
+
+        if trade_date is None:
+            return None, None, None, None, None, None, None, None, None
+
+        # 提取委托时间
+        order_time_str = str(row['委托时间']).strip()
+        try:
+            time_parts = order_time_str.split(':')
+            hour = int(time_parts[0])
+
+            # 如果委托时间 >= 21:00,需要找到下一个交易日
+            if hour >= 21:
+                try:
+                    trade_days = get_trade_days(start_date=trade_date, count=2)
+                    if len(trade_days) >= 2:
+                        next_trade_day = trade_days[1]
+                        if isinstance(next_trade_day, datetime):
+                            actual_trade_date = next_trade_day.date()
+                        elif isinstance(next_trade_day, date):
+                            actual_trade_date = next_trade_day
+                        else:
+                            actual_trade_date = trade_date
+                    else:
+                        actual_trade_date = trade_date
+                except:
+                    actual_trade_date = trade_date
+            else:
+                actual_trade_date = trade_date
+        except:
+            actual_trade_date = trade_date
+
+        # 提取成交价
+        try:
+            trade_price = float(row['成交价'])
+        except:
+            return None, None, None, None, None, None, None, None, None
+
+        # 提取交易方向和类型
+        trade_type = str(row['交易类型']).strip()
+        if '开多' in trade_type:
+            direction = 'long'
+            action = 'open'
+        elif '开空' in trade_type:
+            direction = 'short'
+            action = 'open'
+        elif '平多' in trade_type or '平空' in trade_type:
+            action = 'close'
+            direction = 'long' if '平多' in trade_type else 'short'
+        else:
+            return None, None, None, None, None, None, None, None, None
+
+        # 提取交易对ID和连续交易对ID
+        trade_pair_id = row.get('交易对ID', 'N/A')
+        continuous_pair_id = row.get('连续交易对ID', 'N/A')
+
+        return contract_code, actual_trade_date, trade_price, direction, action, order_time_str, trade_type, trade_pair_id, continuous_pair_id
+
+    except Exception as e:
+        print(f"提取合约信息时出错: {str(e)}")
+        return None, None, None, None, None, None, None, None, None
+
+
+def get_trade_day_range(trade_date, days_before, days_after):
+    """
+    获取交易日范围
+    """
+    try:
+        # 获取历史交易日
+        trade_days_before = get_trade_days(end_date=trade_date, count=days_before + 1)
+        if len(trade_days_before) < days_before + 1:
+            return None, None
+
+        first_day = trade_days_before[0]
+        if isinstance(first_day, datetime):
+            start_date = first_day.date()
+        elif isinstance(first_day, date):
+            start_date = first_day
+        else:
+            start_date = first_day
+
+        # 获取未来交易日
+        trade_days_after = get_trade_days(start_date=trade_date, count=days_after + 1)
+        if len(trade_days_after) < days_after + 1:
+            return None, None
+
+        last_day = trade_days_after[-1]
+        if isinstance(last_day, datetime):
+            end_date = last_day.date()
+        elif isinstance(last_day, date):
+            end_date = last_day
+        else:
+            end_date = last_day
+
+        return start_date, end_date
+
+    except Exception as e:
+        print(f"计算交易日范围时出错: {str(e)}")
+        return None, None
+
+
+def get_kline_data_with_future(contract_code, trade_date, days_before=100, days_after=20):
+    """
+    获取包含历史和未来数据的K线数据
+    """
+    try:
+        # 获取完整的数据范围
+        start_date, end_date = get_trade_day_range(trade_date, days_before, days_after)
+        if start_date is None or end_date is None:
+            return None, None
+
+        # 获取K线数据
+        price_data = get_price(
+            contract_code,
+            start_date=start_date,
+            end_date=end_date,
+            frequency='1d',
+            fields=['open', 'close', 'high', 'low']
+        )
+
+        if price_data is None or len(price_data) == 0:
+            return None, None
+
+        # 若该标的存在开收高低完全一致的日期,则视为异常并直接过滤
+        same_all_mask = (
+            (price_data['close'] == price_data['open']) &
+            (price_data['close'] == price_data['high']) &
+            (price_data['close'] == price_data['low'])
+        )
+        if same_all_mask.any():
+            print(f"{contract_code} 数据异常:存在开盘、收盘、最高、最低完全一致的交易日,跳过该标的。")
+            return None, None
+
+        # 计算均线
+        price_data['ma5'] = price_data['close'].rolling(window=5).mean()
+        price_data['ma10'] = price_data['close'].rolling(window=10).mean()
+        price_data['ma20'] = price_data['close'].rolling(window=20).mean()
+        price_data['ma30'] = price_data['close'].rolling(window=30).mean()
+
+        # 找到交易日在数据中的位置
+        trade_idx = None
+        for i, idx in enumerate(price_data.index):
+            if isinstance(idx, pd.Timestamp) and idx.date() == trade_date:
+                trade_idx = i
+                break
+
+        return price_data, trade_idx
+
+    except Exception as e:
+        print(f"获取K线数据时出错: {str(e)}")
+        return None, None
+
+
+def plot_partial_kline(data, trade_idx, trade_price, direction, contract_code, trade_date, order_time, save_path=None):
+    """
+    绘制部分K线图(仅显示历史数据和当天)
+
+    contract_code 参数当前未直接使用,保留用于可能的扩展功能
+    """
+    try:
+        # 截取历史数据和当天数据
+        partial_data = data.iloc[:trade_idx + 1].copy()
+
+        # 根据交易方向修改当天的价格数据
+        if direction == 'long':
+            # 做多时,用成交价替代最高价(表示买入点)
+            partial_data.iloc[-1, partial_data.columns.get_loc('close')] = trade_price
+            partial_data.iloc[-1, partial_data.columns.get_loc('high')] = trade_price
+        else:
+            # 做空时,用成交价替代最低价(表示卖出点)
+            partial_data.iloc[-1, partial_data.columns.get_loc('close')] = trade_price
+            partial_data.iloc[-1, partial_data.columns.get_loc('low')] = trade_price
+
+        fig, ax = plt.subplots(figsize=(16, 10))
+
+        # 准备数据
+        dates = partial_data.index
+        opens = partial_data['open']
+        highs = partial_data['high']
+        lows = partial_data['low']
+        closes = partial_data['close']
+
+        # 绘制K线
+        for i in range(len(partial_data)):
+            # 检查是否是交易日
+            is_trade_day = (i == trade_idx)
+
+            if is_trade_day:
+                # 成交日根据涨跌用不同颜色
+                if closes.iloc[i] > opens.iloc[i]:  # 上涨
+                    color = '#FFD700'  # 金黄色(黄红色混合)
+                    edge_color = '#FF8C00'  # 深橙色
+                else:  # 下跌
+                    color = '#ADFF2F'  # 黄绿色
+                    edge_color = '#9ACD32'  # 黄绿色深版
+            else:
+                # 正常K线颜色
+                color = 'red' if closes.iloc[i] > opens.iloc[i] else 'green'
+                edge_color = 'darkred' if closes.iloc[i] > opens.iloc[i] else 'darkgreen'
+
+            # 影线
+            ax.plot([i, i], [lows.iloc[i], highs.iloc[i]], color='black', linewidth=1)
+
+            # 实体
+            body_height = abs(closes.iloc[i] - opens.iloc[i])
+            if body_height == 0:
+                body_height = 0.01
+            bottom = min(opens.iloc[i], closes.iloc[i])
+
+            rect = patches.Rectangle((i-0.4, bottom), 0.8, body_height,
+                                   linewidth=1, edgecolor=edge_color,
+                                   facecolor=color, alpha=0.8)
+            ax.add_patch(rect)
+
+        # 绘制均线
+        ax.plot(range(len(partial_data)), partial_data['ma5'], label='MA5', color='blue', linewidth=1.5, alpha=0.8)
+        ax.plot(range(len(partial_data)), partial_data['ma10'], label='MA10', color='orange', linewidth=1.5, alpha=0.8)
+        ax.plot(range(len(partial_data)), partial_data['ma20'], label='MA20', color='purple', linewidth=1.5, alpha=0.8)
+        ax.plot(range(len(partial_data)), partial_data['ma30'], label='MA30', color='brown', linewidth=1.5, alpha=0.8)
+
+        # 获取当天的最高价(用于画连接线)
+        day_high = highs.iloc[trade_idx]
+
+        # 标注信息
+        date_label = trade_date.strftime('%Y-%m-%d')
+        price_label = f'Price: {trade_price:.2f}'
+        direction_label = f'Direction: {"Long" if direction == "long" else "Short"}'
+        time_label = f'Time: {order_time}'
+
+        # 将文本框移到左上角
+        annotation_text = f'{date_label}\n{price_label}\n{direction_label}\n{time_label}'
+        text_box = ax.text(0.02, 0.98, annotation_text,
+               fontsize=10, ha='left', va='top', transform=ax.transAxes,
+               bbox=dict(boxstyle='round,pad=0.6', facecolor='yellow', alpha=0.9, edgecolor='black', linewidth=1.5),
+               zorder=11, weight='bold')
+
+        # 画黄色虚线连接文本框底部和交易日最高价
+        # 获取文本框在数据坐标系中的位置
+        fig.canvas.draw()  # 需要先绘制一次才能获取准确位置
+        bbox = text_box.get_window_extent().transformed(ax.transData.inverted())
+        text_bottom_y = bbox.ymin
+
+        # 从文本框底部到交易日最高价画虚线
+        ax.plot([trade_idx, trade_idx], [day_high, text_bottom_y],
+               color='yellow', linestyle='--', linewidth=1.5, alpha=0.7, zorder=5)
+
+        # 设置标题和标签
+        direction_text = "Long" if direction == "long" else "Short"
+        ax.set_title(f'{direction_text} Position Decision\n'
+                    f'Historical Data + Trade Day Only',
+                    fontsize=14, fontweight='bold', pad=20)
+
+        ax.set_xlabel('Time', fontsize=12)
+        ax.set_ylabel('Price', fontsize=12)
+        ax.grid(True, alpha=0.3)
+        ax.legend(loc='lower left', fontsize=10)
+
+        # 设置x轴标签
+        step = max(1, len(partial_data) // 10)
+        tick_positions = range(0, len(partial_data), step)
+        tick_labels = []
+        for pos in tick_positions:
+            date_val = dates[pos]
+            if isinstance(date_val, (date, datetime)):
+                tick_labels.append(date_val.strftime('%Y-%m-%d'))
+            else:
+                tick_labels.append(str(date_val))
+
+        ax.set_xticks(tick_positions)
+        ax.set_xticklabels(tick_labels, rotation=45, ha='right')
+
+        plt.tight_layout()
+
+        if save_path:
+            plt.savefig(save_path, dpi=CONFIG['plot_dpi'], bbox_inches='tight')
+
+        if CONFIG['show_plots']:
+            plt.show()
+
+        plt.close(fig)
+
+    except Exception as e:
+        print(f"绘制部分K线图时出错: {str(e)}")
+        plt.close('all')
+        raise
+
+
+def plot_full_kline(data, trade_idx, trade_price, direction, contract_code, trade_date, order_time, profit_loss, save_path=None):
+    """
+    绘制完整K线图(包含未来数据)
+    """
+    try:
+        fig, ax = plt.subplots(figsize=(16, 10))
+
+        # 准备数据
+        dates = data.index
+        opens = data['open']
+        highs = data['high']
+        lows = data['low']
+        closes = data['close']
+
+        # 绘制K线
+        for i in range(len(data)):
+            # 检查是否是交易日
+            is_trade_day = (i == trade_idx)
+
+            if is_trade_day:
+                # 成交日根据涨跌用不同颜色
+                if closes.iloc[i] > opens.iloc[i]:  # 上涨
+                    color = '#FFD700'  # 金黄色(黄红色混合)
+                    edge_color = '#FF8C00'  # 深橙色
+                else:  # 下跌
+                    color = '#ADFF2F'  # 黄绿色
+                    edge_color = '#9ACD32'  # 黄绿色深版
+            else:
+                # 正常K线颜色
+                color = 'red' if closes.iloc[i] > opens.iloc[i] else 'green'
+                edge_color = 'darkred' if closes.iloc[i] > opens.iloc[i] else 'darkgreen'
+
+            # 影线
+            ax.plot([i, i], [lows.iloc[i], highs.iloc[i]], color='black', linewidth=1)
+
+            # 实体
+            body_height = abs(closes.iloc[i] - opens.iloc[i])
+            if body_height == 0:
+                body_height = 0.01
+            bottom = min(opens.iloc[i], closes.iloc[i])
+
+            rect = patches.Rectangle((i-0.4, bottom), 0.8, body_height,
+                                   linewidth=1, edgecolor=edge_color,
+                                   facecolor=color, alpha=0.8)
+            ax.add_patch(rect)
+
+        # 绘制均线
+        ax.plot(range(len(data)), data['ma5'], label='MA5', color='blue', linewidth=1.5, alpha=0.8)
+        ax.plot(range(len(data)), data['ma10'], label='MA10', color='orange', linewidth=1.5, alpha=0.8)
+        ax.plot(range(len(data)), data['ma20'], label='MA20', color='purple', linewidth=1.5, alpha=0.8)
+        ax.plot(range(len(data)), data['ma30'], label='MA30', color='brown', linewidth=1.5, alpha=0.8)
+
+        # 获取当天的最高价(用于画连接线)
+        day_high = highs.iloc[trade_idx]
+
+        # 添加未来区域背景
+        ax.axvspan(trade_idx + 0.5, len(data) - 0.5, alpha=0.1, color='gray', label='Future Data')
+
+        # 标注信息
+        date_label = trade_date.strftime('%Y-%m-%d')
+        price_label = f'Price: {trade_price:.2f}'
+        direction_label = f'Direction: {"Long" if direction == "long" else "Short"}'
+        time_label = f'Time: {order_time}'
+        profit_label = f'P&L: {profit_loss:+.2f}'
+
+        # 将文本框移到左上角
+        annotation_text = f'{date_label}\n{price_label}\n{direction_label}\n{time_label}\n{profit_label}'
+        text_box = ax.text(0.02, 0.98, annotation_text,
+               fontsize=10, ha='left', va='top', transform=ax.transAxes,
+               bbox=dict(boxstyle='round,pad=0.6', facecolor='yellow', alpha=0.9, edgecolor='black', linewidth=1.5),
+               zorder=11, weight='bold')
+
+        # 画黄色虚线连接文本框底部和交易日最高价
+        # 获取文本框在数据坐标系中的位置
+        fig.canvas.draw()  # 需要先绘制一次才能获取准确位置
+        bbox = text_box.get_window_extent().transformed(ax.transData.inverted())
+        text_bottom_y = bbox.ymin
+
+        # 从文本框底部到交易日最高价画虚线
+        ax.plot([trade_idx, trade_idx], [day_high, text_bottom_y],
+               color='yellow', linestyle='--', linewidth=1.5, alpha=0.7, zorder=5)
+
+        # 设置标题和标签
+        contract_simple = contract_code.split('.')[0]
+        direction_text = "Long" if direction == "long" else "Short"
+        ax.set_title(f'{contract_simple} - {direction_text} Position Result\n'
+                    f'Complete Data with Future {CONFIG["future_days"]} Days',
+                    fontsize=14, fontweight='bold', pad=20)
+
+        ax.set_xlabel('Time', fontsize=12)
+        ax.set_ylabel('Price', fontsize=12)
+        ax.grid(True, alpha=0.3)
+        ax.legend(loc='lower left', fontsize=10)
+
+        # 设置x轴标签
+        step = max(1, len(data) // 15)
+        tick_positions = range(0, len(data), step)
+        tick_labels = []
+        for pos in tick_positions:
+            date_val = dates[pos]
+            if isinstance(date_val, (date, datetime)):
+                tick_labels.append(date_val.strftime('%Y-%m-%d'))
+            else:
+                tick_labels.append(str(date_val))
+
+        ax.set_xticks(tick_positions)
+        ax.set_xticklabels(tick_labels, rotation=45, ha='right')
+
+        plt.tight_layout()
+
+        if save_path:
+            plt.savefig(save_path, dpi=CONFIG['plot_dpi'], bbox_inches='tight')
+
+        if CONFIG['show_plots']:
+            plt.show()
+
+        plt.close(fig)
+
+    except Exception as e:
+        print(f"绘制完整K线图时出错: {str(e)}")
+        plt.close('all')
+        raise
+
+
+def load_processed_results(result_path):
+    """
+    加载已处理的结果文件
+    """
+    if not os.path.exists(result_path):
+        return pd.DataFrame(), set()
+
+    try:
+        # 简单读取CSV文件
+        df = pd.read_csv(result_path, header=0)
+
+        # 确保必要的列存在
+        required_columns = ['交易对ID']
+        for col in required_columns:
+            if col not in df.columns:
+                print(f"警告:结果文件缺少必要列 '{col}'")
+                return pd.DataFrame(), set()
+
+        # 获取已处理的交易对ID
+        processed_pairs = set(df['交易对ID'].dropna().unique())
+        return df, processed_pairs
+
+    except Exception as e:
+        # 详细打印错误信息
+        print(f"加载结果文件时出错: {str(e)}")
+        print(f"错误类型: {type(e)}")
+
+        return pd.DataFrame(), set()
+
+
+def calculate_profit_loss(df, trade_pair_id, continuous_pair_id):
+    """
+    计算平仓盈亏
+    """
+    try:
+        # 平仓盈亏列可能是带千分位逗号的字符串,先统一清洗为数值
+        profit_series = pd.to_numeric(
+            df['平仓盈亏'].astype(str).str.replace(',', ''),
+            errors='coerce'
+        ).fillna(0)
+
+        if continuous_pair_id != 'N/A' and pd.notna(continuous_pair_id):
+            # 合并所有同一连续交易对ID的平仓盈亏
+            mask = (df['连续交易对ID'] == continuous_pair_id) & (df['交易类型'].str[0] == '平')
+            total_profit = profit_series[mask].sum()
+        else:
+            # 只查找当前交易对ID的平仓交易,取第一条平仓记录的盈亏
+            mask = (df['交易对ID'] == trade_pair_id) & (df['交易类型'].str[0] == '平')
+            matched = profit_series[mask]
+            total_profit = matched.iloc[0] if len(matched) > 0 else 0
+
+        return total_profit
+
+    except Exception as e:
+        print(f"计算盈亏时出错: {str(e)}")
+        return 0
+
+
+def record_result(result_data, result_path):
+    """
+    记录训练结果
+    """
+    try:
+        # 创建结果DataFrame
+        result_df = pd.DataFrame([result_data])
+
+        # 如果文件已存在,读取现有格式并确保新数据格式一致
+        if os.path.exists(result_path):
+            try:
+                # 读取现有文件的列名
+                existing_df = pd.read_csv(result_path, nrows=0)  # 只读取列名
+                existing_columns = existing_df.columns.tolist()
+
+                # 如果新数据列与现有文件不一致,调整格式
+                if list(result_df.columns) != existing_columns:
+                    # 重新创建DataFrame,确保列顺序一致
+                    aligned_data = {}
+                    for col in existing_columns:
+                        aligned_data[col] = result_data.get(col, 'N/A' if col == '连续交易总盈亏' else '')
+                    result_df = pd.DataFrame([aligned_data])
+
+                # 追加写入
+                result_df.to_csv(result_path, mode='a', header=False, index=False, encoding='utf-8-sig')
+            except Exception:
+                # 如果无法读取现有格式,直接覆盖
+                result_df.to_csv(result_path, mode='w', header=True, index=False, encoding='utf-8-sig')
+        else:
+            # 文件不存在,创建新文件
+            result_df.to_csv(result_path, mode='w', header=True, index=False, encoding='utf-8-sig')
+
+        print(f"结果已记录到: {result_path}")
+
+    except Exception as e:
+        print(f"记录结果时出错: {str(e)}")
+
+
+def is_first_continuous_trade(transaction_df, trade_pair_id, continuous_pair_id):
+    """
+    判断是否为连续交易的第一笔交易
+
+    参数:
+        transaction_df: 交易数据DataFrame
+        trade_pair_id: 当前交易对ID
+        continuous_pair_id: 连续交易对ID
+
+    返回:
+        bool: 是否为连续交易的第一笔交易(或不是连续交易)
+    """
+    # 如果不是连续交易,返回True
+    if continuous_pair_id == 'N/A' or pd.isna(continuous_pair_id):
+        return True
+
+    # 获取同一连续交易组的所有交易
+    continuous_trades = transaction_df[transaction_df['连续交易对ID'] == continuous_pair_id]
+
+    # 获取所有交易对ID并按时间排序
+    pair_ids = continuous_trades['交易对ID'].unique()
+
+    # 获取每个交易对的开仓时间
+    pair_times = []
+    for pid in pair_ids:
+        pair_records = continuous_trades[continuous_trades['交易对ID'] == pid]
+        open_records = pair_records[pair_records['交易类型'].str.contains('开', na=False)]
+        if len(open_records) > 0:
+            # 获取第一个开仓记录的日期和时间
+            first_open = open_records.iloc[0]
+            date_str = str(first_open['日期']).strip()
+            time_str = str(first_open['委托时间']).strip()
+            try:
+                dt = pd.to_datetime(f"{date_str} {time_str}")
+                pair_times.append((pid, dt))
+            except:
+                pass
+
+    # 按时间排序
+    pair_times.sort(key=lambda x: x[1])
+
+    # 检查当前交易对是否为第一个
+    if pair_times and pair_times[0][0] == trade_pair_id:
+        return True
+
+    return False
+
+
+def get_user_decision():
+    """
+    获取用户的开仓决策和信心指数
+    
+    返回:
+        tuple: (是否开仓, 信心指数)
+            - 是否开仓: bool
+            - 信心指数: int (1-3)
+    """
+    while True:
+        decision = input("\n是否开仓?请输入 'y,信心指数' (开仓) 或 'n,信心指数' (不开仓)\n" +
+                       "例如: 'y,3' (开仓,高信心) 或 'n,1' (不开仓,低信心)\n" +
+                       "信心指数: 1=低, 2=中, 3=高 (默认为2): ").strip().lower()
+        
+        # 解析输入
+        parts = decision.split(',')
+        decision_part = parts[0].strip()
+        confidence = 2  # 默认信心指数
+        
+        # 检查是否提供了信心指数
+        if len(parts) >= 2:
+            try:
+                confidence = int(parts[1].strip())
+                if confidence not in [1, 2, 3]:
+                    print("信心指数必须是 1、2 或 3,请重新输入")
+                    continue
+            except ValueError:
+                print("信心指数必须是数字 1、2 或 3,请重新输入")
+                continue
+        
+        # 检查开仓决策
+        if decision_part in ['y', 'yes', '是', '开仓']:
+            return True, confidence
+        elif decision_part in ['n', 'no', '否', '不开仓']:
+            return False, confidence
+        else:
+            print("请输入有效的选项: 'y' 或 'n' (可选择性添加信心指数,如 'y,3')")
+
+
+def main():
+    """
+    主函数
+    """
+    print("=" * 60)
+    print("交易训练工具")
+    print("=" * 60)
+
+    # 设置随机种子
+    if CONFIG['random_seed'] is not None:
+        random.seed(CONFIG['random_seed'])
+        np.random.seed(CONFIG['random_seed'])
+
+    # 获取当前目录
+    current_dir = _get_current_directory()
+    csv_path = os.path.join(current_dir, CONFIG['csv_filename'])
+    result_path = os.path.join(current_dir, CONFIG['result_filename'])
+    output_dir = os.path.join(current_dir, CONFIG['output_dir'])
+
+    # 创建输出目录
+    os.makedirs(output_dir, exist_ok=True)
+
+    # 1. 读取交易数据
+    print("\n=== 步骤1: 读取交易数据 ===")
+    transaction_df = read_transaction_data(csv_path)
+    if len(transaction_df) == 0:
+        print("未能读取交易数据,退出")
+        return
+
+    # 2. 加载已处理的结果
+    print("\n=== 步骤2: 加载已处理记录 ===")
+    _, processed_pairs = load_processed_results(result_path)
+    print(f"已处理 {len(processed_pairs)} 个交易对")
+
+    # 3. 提取所有开仓交易
+    print("\n=== 步骤3: 提取开仓交易 ===")
+    open_trades = []
+
+    for idx, row in transaction_df.iterrows():
+        contract_code, trade_date, trade_price, direction, action, order_time, trade_type, trade_pair_id, continuous_pair_id = extract_contract_info(row)
+
+        if contract_code is None or action != 'open':
+            continue
+
+        # 跳过已处理的交易对
+        if trade_pair_id in processed_pairs:
+            continue
+
+        # 检查是否为连续交易的第一笔交易(如果不是第一笔,跳过)
+        if not is_first_continuous_trade(transaction_df, trade_pair_id, continuous_pair_id):
+            continue
+
+        # 查找对应的平仓交易
+        profit_loss = calculate_profit_loss(transaction_df, trade_pair_id, continuous_pair_id)
+
+        # 如果是连续交易,获取连续交易总盈亏
+        continuous_total_profit = 'N/A'
+        if continuous_pair_id != 'N/A' and pd.notna(continuous_pair_id):
+            continuous_trades = transaction_df[transaction_df['连续交易对ID'] == continuous_pair_id]
+            try:
+                close_profit_loss_str = continuous_trades['平仓盈亏'].astype(str).str.replace(',', '')
+                close_profit_loss_numeric = pd.to_numeric(close_profit_loss_str, errors='coerce').fillna(0)
+                continuous_total_profit = close_profit_loss_numeric.sum()
+            except:
+                continuous_total_profit = 0
+
+        open_trades.append({
+            'index': idx,
+            'contract_code': contract_code,
+            'trade_date': trade_date,
+            'trade_price': trade_price,
+            'direction': direction,
+            'order_time': order_time,
+            'trade_type': trade_type,
+            'trade_pair_id': trade_pair_id,
+            'continuous_pair_id': continuous_pair_id,
+            'profit_loss': profit_loss,
+            'continuous_total_profit': continuous_total_profit,
+            'original_row': row
+        })
+
+    print(f"找到 {len(open_trades)} 个未处理的开仓交易(已过滤非首笔连续交易)")
+
+    if len(open_trades) == 0:
+        print("没有未处理的开仓交易,退出")
+        return
+
+    # 4. 构建候选交易列表(按标的类型分组轮询,避免同类集中)
+    print("\n=== 步骤4: 构建候选交易列表 ===")
+
+    # 按标的类型分组(提取合约代码的核心字母部分)
+    def get_contract_type(contract_code):
+        """提取合约类型,如'M2405'提取为'M','AG2406'提取为'AG'"""
+        match = re.match(r'^([A-Za-z]+)', contract_code.split('.')[0])
+        return match.group(1) if match else 'UNKNOWN'
+
+    # 按合约类型分组
+    trades_by_type = {}
+    for trade in open_trades:
+        contract_type = get_contract_type(trade['contract_code'])
+        if contract_type not in trades_by_type:
+            trades_by_type[contract_type] = []
+        trades_by_type[contract_type].append(trade)
+
+    # 打乱每个组内的顺序
+    for contract_type in trades_by_type:
+        random.shuffle(trades_by_type[contract_type])
+
+    def build_trade_queue(trade_groups):
+        type_order = list(trade_groups.keys())
+        random.shuffle(type_order)
+        queue = []
+        while True:
+            added = False
+            for contract_type in type_order:
+                if trade_groups[contract_type]:
+                    queue.append(trade_groups[contract_type].pop(0))
+                    added = True
+            if not added:
+                break
+        return queue
+
+    trade_queue = build_trade_queue(trades_by_type)
+    print(f"候选交易数量: {len(trade_queue)}")
+
+    # 5. 依次尝试获取K线数据,若失败则自动尝试下一候选
+    print("\n=== 步骤5: 获取K线数据 ===")
+    selected_trade = None
+    kline_data = None
+    trade_idx = None
+
+    for i, candidate_trade in enumerate(trade_queue):
+        print(f"尝试候选 {i + 1}/{len(trade_queue)}: {candidate_trade['contract_code']} {candidate_trade['trade_date']}")
+        kline_data, trade_idx = get_kline_data_with_future(
+            candidate_trade['contract_code'],
+            candidate_trade['trade_date'],
+            CONFIG['history_days'],
+            CONFIG['future_days']
+        )
+
+        if kline_data is None or trade_idx is None:
+            print("获取K线数据失败,尝试下一个候选。")
+            continue
+
+        selected_trade = candidate_trade
+        remaining = len(trade_queue) - (i + 1)
+        print(f"成功获取K线数据,剩余候选 {remaining} 个")
+        break
+
+    if selected_trade is None:
+        print("所有候选交易均无法获取有效K线数据,退出")
+        return
+
+    # 6. 显示部分K线图
+    print("\n=== 步骤6: 显示部分K线图 ===")
+    partial_image_name = f"partial_{selected_trade['contract_code']}_{selected_trade['trade_date']}_{selected_trade['direction']}.png"
+    partial_image_path = os.path.join(output_dir, partial_image_name)
+    plot_partial_kline(
+        kline_data, trade_idx, selected_trade['trade_price'],
+        selected_trade['direction'], selected_trade['contract_code'],
+        selected_trade['trade_date'], selected_trade['order_time'],
+        partial_image_path
+    )
+
+    # 7. 获取用户决策和信心指数
+    print("\n=== 步骤7: 获取用户决策 ===")
+    user_decision, confidence_level = get_user_decision()
+
+    # 8. 显示完整K线图
+    print("\n=== 步骤8: 显示完整K线图 ===")
+    full_image_name = f"full_{selected_trade['contract_code']}_{selected_trade['trade_date']}_{selected_trade['direction']}.png"
+    full_image_path = os.path.join(output_dir, full_image_name)
+    plot_full_kline(
+        kline_data, trade_idx, selected_trade['trade_price'],
+        selected_trade['direction'], selected_trade['contract_code'],
+        selected_trade['trade_date'], selected_trade['order_time'],
+        selected_trade['profit_loss'],
+        full_image_path
+    )
+
+    # 在完整K线图之后显示交易信息
+    print(f"\n交易信息:")
+    print(f"合约: {selected_trade['contract_code']}")
+    print(f"日期: {selected_trade['trade_date']}")
+    print(f"方向: {'多头' if selected_trade['direction'] == 'long' else '空头'}")
+    print(f"成交价: {selected_trade['trade_price']}")
+
+    # 9. 记录结果
+    print("\n=== 步骤9: 记录结果 ===")
+
+    # 计算判定收益(使用连续交易总盈亏或普通盈亏)
+    if selected_trade['continuous_total_profit'] != 'N/A':
+        # 连续交易使用连续交易总盈亏
+        decision_profit = selected_trade['continuous_total_profit'] if user_decision else -selected_trade['continuous_total_profit']
+        profit_to_show = selected_trade['continuous_total_profit']
+    else:
+        # 普通交易使用单笔盈亏
+        decision_profit = selected_trade['profit_loss'] if user_decision else -selected_trade['profit_loss']
+        profit_to_show = selected_trade['profit_loss']
+
+    result_data = {
+        '日期': selected_trade['original_row']['日期'],
+        '委托时间': selected_trade['original_row']['委托时间'],
+        '标的': selected_trade['original_row']['标的'],
+        '交易类型': selected_trade['original_row']['交易类型'],
+        '成交数量': selected_trade['original_row']['成交数量'],
+        '成交价': selected_trade['original_row']['成交价'],
+        '平仓盈亏': selected_trade['profit_loss'],
+        '用户判定': '开仓' if user_decision else '不开仓',
+        '信心指数': confidence_level,
+        '判定收益': decision_profit,
+        '交易对ID': selected_trade['trade_pair_id'],
+        '连续交易对ID': selected_trade['continuous_pair_id'],
+        '连续交易总盈亏': selected_trade['continuous_total_profit']
+    }
+
+    record_result(result_data, result_path)
+
+    print(f"\n=== 训练完成 ===")
+    print(f"用户判定: {'开仓' if user_decision else '不开仓'}")
+    print(f"信心指数: {confidence_level} ({'低' if confidence_level == 1 else '中' if confidence_level == 2 else '高'})")
+    if selected_trade['continuous_total_profit'] != 'N/A':
+        print(f"连续交易总盈亏: {profit_to_show:+.2f}")
+    else:
+        print(f"实际盈亏: {profit_to_show:+.2f}")
+    print(f"判定收益: {decision_profit:+.2f}")
+    print(f"结果已保存到: {result_path}")
+
+
+if __name__ == "__main__":
+    main()

+ 868 - 0
Lib/future/transaction_pair_analysis.py

@@ -0,0 +1,868 @@
+# 交易配对分析工具
+# 用于从交易记录CSV文件中为开仓/平仓交易进行配对,为每对关联交易分配相同ID
+
+import pandas as pd
+import numpy as np
+from datetime import datetime
+import re
+import os
+import warnings
+warnings.filterwarnings('ignore')
+
+
+def _get_current_directory():
+    """
+    获取当前文件所在目录
+    
+    返回:
+        str: 当前目录路径
+    """
+    try:
+        current_dir = os.path.dirname(os.path.abspath(__file__))
+    except NameError:
+        current_dir = os.getcwd()
+    return current_dir
+
+
+def read_transaction_csv(csv_path):
+    """
+    读取交易记录CSV文件,支持多种编码格式
+    
+    参数:
+        csv_path (str): CSV文件路径
+    
+    返回:
+        pandas.DataFrame: 包含交易记录的DataFrame
+    """
+    encodings = ['gbk', 'utf-8-sig', 'utf-8', 'gb2312', 'gb18030', 'latin1']
+    
+    for encoding in encodings:
+        try:
+            df = pd.read_csv(csv_path, encoding=encoding)
+            print(f"成功使用 {encoding} 编码读取CSV文件")
+            print(f"从CSV文件中读取到 {len(df)} 条记录")
+            return df
+        except UnicodeDecodeError:
+            continue
+        except Exception as e:
+            if encoding == encodings[-1]:
+                print(f"读取CSV文件时出错: {str(e)}")
+                raise
+            continue
+    
+    print(f"无法使用任何编码格式读取CSV文件: {csv_path}")
+    return pd.DataFrame()
+
+
+def parse_transaction_data(df):
+    """
+    解析交易数据,提取关键信息
+    
+    参数:
+        df (pandas.DataFrame): 原始交易数据
+    
+    返回:
+        pandas.DataFrame: 添加了解析字段的DataFrame
+    """
+    df = df.copy()
+    
+    # 提取标的(完整信息用于匹配)
+    df['标的_完整'] = df['标的'].astype(str).str.strip()
+    
+    # 提取交易类型
+    df['交易类型_标准'] = df['交易类型'].astype(str).str.strip()
+    
+    # 判断是开仓还是平仓
+    df['仓位操作'] = df['交易类型_标准'].apply(lambda x: '开仓' if '开' in x else ('平仓' if '平' in x else '未知'))
+    
+    # 判断方向(多/空)
+    df['方向'] = df['交易类型_标准'].apply(lambda x: '多' if '多' in x else ('空' if '空' in x else '未知'))
+    
+    # 从成交数量中提取数字(去掉"手"等单位)
+    def extract_quantity(qty_str):
+        """从成交数量字符串中提取数字"""
+        try:
+            qty_str = str(qty_str).strip()
+            # 使用正则提取数字(包括负号和小数点)
+            match = re.search(r'(-?\d+(?:\.\d+)?)', qty_str)
+            if match:
+                return abs(float(match.group(1)))  # 返回绝对值
+            return 0
+        except:
+            return 0
+    
+    df['成交数量_数值'] = df['成交数量'].apply(extract_quantity)
+    
+    # 合并日期和时间为完整时间戳用于排序
+    def parse_datetime(row):
+        """解析日期和时间"""
+        try:
+            date_str = str(row['日期']).strip()
+            time_str = str(row['委托时间']).strip()
+            datetime_str = f"{date_str} {time_str}"
+            return pd.to_datetime(datetime_str)
+        except:
+            return pd.NaT
+    
+    df['交易时间'] = df.apply(parse_datetime, axis=1)
+    
+    # 过滤掉无效记录
+    df = df[df['成交数量_数值'] > 0].copy()
+    df = df[df['仓位操作'] != '未知'].copy()
+    df = df[df['方向'] != '未知'].copy()
+    df = df[~df['交易时间'].isna()].copy()
+    
+    print(f"解析后有效记录: {len(df)} 条")
+    
+    return df
+
+
+def fix_incomplete_pairs(df):
+    """
+    修复不完整的配对(配对ID只出现一次的情况)
+    
+    参数:
+        df (pandas.DataFrame): 已配对的交易数据
+    
+    返回:
+        pandas.DataFrame: 修复后的DataFrame
+    """
+    df = df.copy()
+    max_iterations = 10  # 最大迭代次数,防止无限循环
+    iteration = 0
+    
+    print(f"\n开始修复不完整配对...")
+    
+    while iteration < max_iterations:
+        iteration += 1
+        
+        # 统计每个配对ID出现的次数(排除"未配对"和空值)
+        paired_mask = (df['交易对ID'] != '') & (df['交易对ID'] != '未配对')
+        paired_df = df[paired_mask]
+        
+        if len(paired_df) == 0:
+            break
+        
+        # 统计每个配对ID出现的次数
+        pair_id_counts = paired_df['交易对ID'].value_counts()
+        
+        # 找出只出现一次的配对ID(不完整配对)
+        incomplete_pair_ids = pair_id_counts[pair_id_counts == 1].index.tolist()
+        
+        if len(incomplete_pair_ids) == 0:
+            print(f"  迭代 {iteration}: 没有发现不完整配对,修复完成")
+            break
+        
+        print(f"  迭代 {iteration}: 发现 {len(incomplete_pair_ids)} 个不完整配对")
+        
+        fixed_count = 0
+        
+        # 处理每个不完整配对
+        for pair_id in incomplete_pair_ids:
+            # 找到这个配对ID对应的交易
+            pair_mask = df['交易对ID'] == pair_id
+            incomplete_trade = df[pair_mask].iloc[0]
+            
+            target = incomplete_trade['标的_完整']
+            direction = incomplete_trade['方向']
+            operation = incomplete_trade['仓位操作']  # 开仓或平仓
+            trade_time = incomplete_trade['交易时间']
+            trade_qty = incomplete_trade['成交数量_数值']
+            
+            # 查找可匹配的未配对交易
+            # 条件:相同标的、相同方向、配对ID为空或"未配对"
+            unpaired_mask = (
+                (df['标的_完整'] == target) &
+                (df['方向'] == direction) &
+                ((df['交易对ID'] == '') | (df['交易对ID'] == '未配对'))  # 未配对
+            )
+            unpaired_trades = df[unpaired_mask].copy()
+            
+            if len(unpaired_trades) == 0:
+                continue
+            
+            # 根据不完整交易的类型,查找匹配的交易
+            if operation == '开仓':
+                # 如果是开仓,查找平仓交易(时间在开仓之后)
+                matching_trades = unpaired_trades[
+                    (unpaired_trades['仓位操作'] == '平仓') &
+                    (unpaired_trades['交易时间'] >= trade_time)
+                ].sort_values('交易时间')
+            else:  # operation == '平仓'
+                # 如果是平仓,查找开仓交易(时间在平仓之前)
+                matching_trades = unpaired_trades[
+                    (unpaired_trades['仓位操作'] == '开仓') &
+                    (unpaired_trades['交易时间'] <= trade_time)
+                ].sort_values('交易时间', ascending=False)  # 从后往前,优先匹配最近的
+            
+            if len(matching_trades) == 0:
+                continue
+            
+            # 使用贪心算法累加匹配的交易
+            remaining_qty = trade_qty
+            matched_indices = []
+            
+            for idx in matching_trades.index:
+                match_qty = df.loc[idx, '成交数量_数值']
+                
+                if remaining_qty <= 0:
+                    break
+                
+                if match_qty <= remaining_qty:
+                    # 这笔交易可以加入配对
+                    matched_indices.append(idx)
+                    remaining_qty -= match_qty
+                else:
+                    # 交易数量大于剩余需要量,跳过(保持精确匹配)
+                    continue
+            
+            # 如果找到精确匹配的组合(剩余数量为0或接近0)
+            if len(matched_indices) > 0 and abs(remaining_qty) < 0.01:
+                # 将匹配的交易添加到对应的配对ID中
+                for idx in matched_indices:
+                    df.loc[idx, '交易对ID'] = pair_id
+                
+                fixed_count += len(matched_indices)
+                
+                if operation == '开仓':
+                    print(f"    修复: {target} {direction} 1笔开仓({trade_qty:.0f}手) 匹配 {len(matched_indices)}笔平仓 -> {pair_id}")
+                else:
+                    print(f"    修复: {target} {direction} 1笔平仓({trade_qty:.0f}手) 匹配 {len(matched_indices)}笔开仓 -> {pair_id}")
+        
+        if fixed_count == 0:
+            # 没有修复任何配对,退出循环
+            print(f"  迭代 {iteration}: 无法修复更多配对,停止")
+            break
+        
+        print(f"  迭代 {iteration}: 修复了 {fixed_count} 笔交易")
+    
+    # 最终检查:统计不完整配对
+    paired_mask = (df['交易对ID'] != '') & (df['交易对ID'] != '未配对')
+    paired_df = df[paired_mask]
+    if len(paired_df) > 0:
+        pair_id_counts = paired_df['交易对ID'].value_counts()
+        incomplete_count = len(pair_id_counts[pair_id_counts == 1])
+        if incomplete_count > 0:
+            print(f"\n警告: 仍有 {incomplete_count} 个不完整配对无法修复")
+        else:
+            print(f"\n修复完成: 所有配对都已完整")
+    
+    return df
+
+
+def pair_transactions(df):
+    """
+    为交易进行配对,分配交易对ID
+    
+    参数:
+        df (pandas.DataFrame): 解析后的交易数据
+    
+    返回:
+        pandas.DataFrame: 添加了交易对ID的DataFrame
+    """
+    df = df.copy()
+    df['交易对ID'] = ''  # 初始化交易对ID列
+    
+    # 按交易时间排序
+    df = df.sort_values('交易时间').reset_index(drop=True)
+    
+    pair_id_counter = 1  # 交易对ID计数器
+    paired_count = 0  # 已配对交易数
+    unpaired_count = 0  # 未配对交易数
+    
+    # 按标的分组
+    grouped = df.groupby('标的_完整')
+    
+    print(f"\n开始配对,共有 {len(grouped)} 个不同标的")
+    
+    for target, group in grouped:
+        # 再按方向分组(多/空)
+        for direction in ['多', '空']:
+            direction_group = group[group['方向'] == direction].copy()
+            
+            if len(direction_group) == 0:
+                continue
+            
+            # 分离开仓和平仓交易
+            open_trades = direction_group[direction_group['仓位操作'] == '开仓'].copy()
+            close_trades = direction_group[direction_group['仓位操作'] == '平仓'].copy()
+            
+            if len(open_trades) == 0 or len(close_trades) == 0:
+                # 标记未配对的交易
+                for idx in direction_group.index:
+                    df.loc[idx, '交易对ID'] = f'未配对'
+                    unpaired_count += 1
+                continue
+            
+            # 第零阶段:优先处理同一时间的交易(特别是数量相等的1开1平)
+            # 按时间分组,处理同一时间点的开仓和平仓
+            time_groups = direction_group.groupby('交易时间')
+            for time_key, time_group in time_groups:
+                time_open_trades = time_group[time_group['仓位操作'] == '开仓'].copy()
+                time_close_trades = time_group[time_group['仓位操作'] == '平仓'].copy()
+                
+                if len(time_open_trades) == 0 or len(time_close_trades) == 0:
+                    continue
+                
+                # 优先匹配数量完全相等的1开1平
+                for close_idx in time_close_trades.index:
+                    if df.loc[close_idx, '交易对ID'] != '':  # 已配对的平仓交易跳过
+                        continue
+                    
+                    close_qty = df.loc[close_idx, '成交数量_数值']
+                    
+                    # 查找同一时间、数量相等的未配对开仓
+                    matching_open = time_open_trades[
+                        (time_open_trades['成交数量_数值'] == close_qty) &
+                        (df.loc[time_open_trades.index, '交易对ID'] == '')
+                    ]
+                    
+                    if len(matching_open) > 0:
+                        # 找到匹配的开仓,优先使用第一个
+                        open_idx = matching_open.index[0]
+                        pair_id = f'P{pair_id_counter:04d}'
+                        df.loc[open_idx, '交易对ID'] = pair_id
+                        df.loc[close_idx, '交易对ID'] = pair_id
+                        
+                        paired_count += 2
+                        pair_id_counter += 1
+                        print(f"  同时间1开1平匹配: {target} {direction} {close_qty:.0f}手 -> {pair_id}")
+            
+            # 第一阶段:多开1平匹配
+            # 遍历每笔平仓交易,查找可以合并匹配的多笔开仓
+            for close_idx in close_trades.index:
+                if df.loc[close_idx, '交易对ID'] != '':  # 已配对的平仓交易跳过
+                    continue
+                
+                close_time = df.loc[close_idx, '交易时间']
+                close_qty = df.loc[close_idx, '成交数量_数值']
+                
+                # 查找该平仓交易之前或同一时间的所有未配对开仓交易
+                # 注意:同一时间的1开1平已经在第零阶段处理,这里主要处理多开1平的情况
+                valid_open_trades = open_trades[
+                    (open_trades['交易时间'] <= close_time) & 
+                    (df.loc[open_trades.index, '交易对ID'] == '')  # 未被配对的开仓交易
+                ].copy()
+                
+                if len(valid_open_trades) == 0:
+                    continue
+                
+                # 尝试找到开仓数量之和等于平仓数量的组合
+                # 使用贪心算法:按时间顺序累加开仓数量
+                remaining_qty = close_qty
+                paired_open_indices = []
+                
+                for open_idx in valid_open_trades.index:
+                    open_qty = df.loc[open_idx, '成交数量_数值']
+                    
+                    if remaining_qty <= 0:
+                        break
+                    
+                    if open_qty <= remaining_qty:
+                        # 这笔开仓可以加入配对
+                        paired_open_indices.append(open_idx)
+                        remaining_qty -= open_qty
+                    else:
+                        # 开仓数量大于剩余需要量,跳过(保持精确匹配)
+                        continue
+                
+                # 如果找到精确匹配的组合(剩余数量为0或接近0)
+                if len(paired_open_indices) > 0 and abs(remaining_qty) < 0.01:
+                    # 为配对的交易分配相同的ID
+                    pair_id = f'P{pair_id_counter:04d}'
+                    for open_idx in paired_open_indices:
+                        df.loc[open_idx, '交易对ID'] = pair_id
+                    df.loc[close_idx, '交易对ID'] = pair_id
+                    
+                    paired_count += len(paired_open_indices) + 1  # 开仓+平仓
+                    pair_id_counter += 1
+                    
+                    if len(paired_open_indices) > 1:
+                        print(f"  多开1平匹配: {target} {direction} {len(paired_open_indices)}笔开仓({sum([df.loc[idx, '成交数量_数值'] for idx in paired_open_indices]):.0f}手) 匹配 1笔平仓({close_qty:.0f}手) -> {pair_id}")
+            
+            # 第二阶段:1开多平匹配(原有逻辑)
+            # 遍历每笔开仓交易,寻找匹配的平仓交易
+            for open_idx in open_trades.index:
+                # 跳过已配对的开仓交易
+                if df.loc[open_idx, '交易对ID'] != '':
+                    continue
+                
+                open_time = df.loc[open_idx, '交易时间']
+                open_qty = df.loc[open_idx, '成交数量_数值']
+                
+                # 查找该开仓交易之后的平仓交易
+                valid_close_trades = close_trades[
+                    (close_trades['交易时间'] >= open_time) & 
+                    (df.loc[close_trades.index, '交易对ID'] == '')  # 未被配对的平仓交易(以 df 中的最新配对状态为准,避免使用副本中的过期状态)
+                ].copy()
+                
+                if len(valid_close_trades) == 0:
+                    # 没有找到匹配的平仓交易
+                    df.loc[open_idx, '交易对ID'] = '未配对'
+                    unpaired_count += 1
+                    continue
+                
+                # 累计平仓数量,直到等于开仓数量
+                remaining_qty = open_qty
+                paired_close_indices = []
+                
+                for close_idx in valid_close_trades.index:
+                    close_qty_val = df.loc[close_idx, '成交数量_数值']
+                    
+                    if remaining_qty <= 0:
+                        break
+                    
+                    if close_qty_val <= remaining_qty:
+                        # 这笔平仓完全匹配
+                        paired_close_indices.append(close_idx)
+                        remaining_qty -= close_qty_val
+                    elif close_qty_val > remaining_qty:
+                        # 这笔平仓数量大于剩余需要量,跳过(只接受精确匹配的组合)
+                        continue
+                
+                # 为配对的交易分配相同的ID(要求精确匹配或接近精确匹配)
+                if len(paired_close_indices) > 0 and abs(remaining_qty) < 0.01:
+                    pair_id = f'P{pair_id_counter:04d}'
+                    df.loc[open_idx, '交易对ID'] = pair_id
+                    for close_idx in paired_close_indices:
+                        df.loc[close_idx, '交易对ID'] = pair_id
+                    
+                    paired_count += len(paired_close_indices) + 1  # 开仓+平仓
+                    pair_id_counter += 1
+                    
+                    if len(paired_close_indices) > 1:
+                        print(f"  1开多平匹配: {target} {direction} 1笔开仓({open_qty:.0f}手) 匹配 {len(paired_close_indices)}笔平仓({sum([df.loc[idx, '成交数量_数值'] for idx in paired_close_indices]):.0f}手) -> {pair_id}")
+                elif len(paired_close_indices) > 0:
+                    # 部分匹配,发出警告但不配对
+                    print(f"  警告: {target} {direction} 开仓在 {open_time} 有 {remaining_qty:.2f} 未配对,跳过配对")
+                    df.loc[open_idx, '交易对ID'] = '未配对'
+                    unpaired_count += 1
+                else:
+                    # 没有配对成功
+                    df.loc[open_idx, '交易对ID'] = '未配对'
+                    unpaired_count += 1
+    
+    # 统计信息
+    print(f"\n配对完成:")
+    print(f"  已配对交易: {paired_count} 条")
+    print(f"  未配对交易: {unpaired_count} 条")
+    print(f"  生成交易对: {pair_id_counter - 1} 对")
+    
+    # 后处理:修复不完整配对
+    df = fix_incomplete_pairs(df)
+    
+    return df
+
+
+def extract_symbol_core(symbol):
+    """
+    从标的字符串中提取标的核心字母
+
+    参数:
+        symbol (str): 标的字符串,如"10年期国债期货(T2006.CCFX)"
+
+    返回:
+        tuple: (括号内完整代码, 标的核心字母),如 ("T2006.CCFX", "T")
+    """
+    try:
+        # 提取括号内的内容
+        match = re.search(r'\(([^)]+)\)', symbol)
+        if match:
+            full_code = match.group(1)
+            # 去掉后面的9位获取标的核心字母
+            core_symbol = full_code[:-9] if len(full_code) > 9 else full_code
+            return full_code, core_symbol
+        else:
+            return symbol, symbol
+    except:
+        return symbol, symbol
+
+
+def identify_continuous_trade_pairs(df):
+    """
+    识别连续交易对
+
+    参数:
+        df (pandas.DataFrame): 包含交易对ID的交易数据
+
+    返回:
+        pandas.DataFrame: 添加了连续交易对ID的DataFrame
+    """
+    print("\n开始识别连续交易对...")
+
+    df = df.copy()
+    df['连续交易对ID'] = 'N/A'  # 初始化连续交易对ID列
+
+    # 提取标的核心字母
+    df['标的核心字母'] = df['标的'].apply(lambda x: extract_symbol_core(x)[1])
+
+    # 获取所有已配对的交易对ID
+    paired_mask = df['交易对ID'].str.startswith('P', na=False)
+    paired_df = df[paired_mask].copy()
+
+    if len(paired_df) == 0:
+        print("没有已配对的交易")
+        return df
+
+    # 按交易对ID分组
+    pair_groups = paired_df.groupby('交易对ID')
+
+    # 存储连续交易对关系
+    continuous_groups = []  # 每个元素是一组连续的交易对ID
+    processed_pairs = set()  # 已处理的交易对ID
+
+    for pair_id, group in pair_groups:
+        if pair_id in processed_pairs:
+            continue
+
+        # 获取当前交易对的开仓和平仓记录
+        open_trades = group[group['仓位操作'] == '开仓'].copy()
+        close_trades = group[group['仓位操作'] == '平仓'].copy()
+
+        if len(open_trades) == 0 or len(close_trades) == 0:
+            continue
+
+        # 获取当前交易对的关键信息
+        current_core_symbol = group['标的核心字母'].iloc[0]
+        current_direction = group['方向'].iloc[0]
+        current_close_date = close_trades['日期'].iloc[0]
+        current_close_time = close_trades['委托时间'].iloc[0]
+        current_close_qty = close_trades['成交数量_数值'].sum()
+
+        # 查找匹配的连续交易对
+        matching_pairs = []
+
+        for other_pair_id, other_group in pair_groups:
+            if other_pair_id == pair_id or other_pair_id in processed_pairs:
+                continue
+
+            # 获取另一个交易对的开仓和平仓记录
+            other_open_trades = other_group[other_group['仓位操作'] == '开仓'].copy()
+            other_close_trades = other_group[other_group['仓位操作'] == '平仓'].copy()
+
+            if len(other_open_trades) == 0 or len(other_close_trades) == 0:
+                continue
+
+            # 检查条件1:平仓和开仓的日期、委托时间完全一致
+            other_open_date = other_open_trades['日期'].iloc[0]
+            other_open_time = other_open_trades['委托时间'].iloc[0]
+
+            # 由于可能有多个开仓,检查是否有至少一个与平仓时间完全一致
+            time_match_found = False
+            for _, open_trade in other_open_trades.iterrows():
+                if (open_trade['日期'] == current_close_date and
+                    open_trade['委托时间'] == current_close_time):
+                    time_match_found = True
+                    break
+
+            if not time_match_found:
+                continue
+
+            # 检查条件2:交易类型匹配(平多对应开多,平空对应开空)
+            other_direction = other_group['方向'].iloc[0]
+            if current_direction != other_direction:
+                continue
+
+            # 检查条件3:标的核心字母一致
+            other_core_symbol = other_group['标的核心字母'].iloc[0]
+            if current_core_symbol != other_core_symbol:
+                continue
+
+            # 检查条件4:成交数量绝对值一致
+            other_open_qty = other_open_trades['成交数量_数值'].sum()
+            if abs(current_close_qty - other_open_qty) > 0.01:
+                continue
+
+            # 所有条件都满足,这是一个连续交易对
+            matching_pairs.append(other_pair_id)
+            processed_pairs.add(other_pair_id)
+
+        # 如果找到匹配的连续交易对
+        if matching_pairs:
+            # 创建连续交易对组(包含当前交易对和所有匹配的交易对)
+            continuous_group = [pair_id] + matching_pairs
+            continuous_groups.append(continuous_group)
+            processed_pairs.add(pair_id)
+
+            print(f"  发现连续交易对组: {continuous_group}")
+            print(f"    核心标的: {current_core_symbol}, 方向: {current_direction}")
+            print(f"    平仓时间: {current_close_date} {current_close_time}")
+            print(f"    平仓数量: {current_close_qty:.2f}(匹配条件已保证对应开仓数量与之一致)")
+
+    # 为连续交易对分配ID
+    for i, continuous_group in enumerate(continuous_groups):
+        continuous_id = f'C{i+1:04d}'
+        for pair_id in continuous_group:
+            mask = df['交易对ID'] == pair_id
+            df.loc[mask, '连续交易对ID'] = continuous_id
+
+    print(f"\n识别完成,共发现 {len(continuous_groups)} 组连续交易对")
+
+    # 清理临时列
+    df = df.drop('标的核心字母', axis=1)
+
+    return df
+
+
+def save_result(df, output_path):
+    """
+    保存配对结果到CSV文件
+
+    参数:
+        df (pandas.DataFrame): 包含交易对ID的DataFrame
+        output_path (str): 输出文件路径
+    """
+    df = df.copy()
+
+    # 添加"开仓时间"列
+    # 对于每个交易对ID,找到对应的开仓记录的"最后更新时间"
+    df['开仓时间'] = ''
+
+    # 添加"交易盈亏"列,根据相同的交易对ID对平仓盈亏进行求和
+    df['交易盈亏'] = ''
+
+    # 添加"连续交易总盈亏"列
+    df['连续交易总盈亏'] = 'N/A'
+
+    # 先计算每个交易对的盈亏
+    for pair_id in df['交易对ID'].unique():
+        if pair_id and pair_id.startswith('P'):
+            # 找到该交易对的所有记录
+            pair_mask = df['交易对ID'] == pair_id
+            pair_records = df[pair_mask]
+
+            # 找到开仓记录(仓位操作为"开仓")
+            open_record = pair_records[pair_records['仓位操作'] == '开仓']
+
+            if len(open_record) > 0:
+                # 获取开仓记录的最后更新时间
+                open_time = open_record.iloc[0]['最后更新时间']
+                # 将开仓时间填充到该交易对的所有记录中
+                df.loc[pair_mask, '开仓时间'] = open_time
+
+            # 计算该交易对的总盈亏(对平仓盈亏求和)
+            try:
+                # 提取平仓盈亏列,转换为数值
+                # 先转换为字符串,去掉千位分隔符(逗号),然后转换为数值
+                close_profit_loss_str = pair_records['平仓盈亏'].astype(str).str.replace(',', '')
+                # 尝试转换为数值,无法转换的设为0
+                close_profit_loss_numeric = pd.to_numeric(close_profit_loss_str, errors='coerce').fillna(0)
+                total_profit_loss = close_profit_loss_numeric.sum()
+                # 将总盈亏填充到该交易对的所有记录中
+                df.loc[pair_mask, '交易盈亏'] = total_profit_loss
+            except Exception as e:
+                # 如果计算失败,设为0
+                df.loc[pair_mask, '交易盈亏'] = 0
+
+    # 计算连续交易总盈亏
+    for continuous_id in df['连续交易对ID'].unique():
+        if continuous_id != 'N/A' and pd.notna(continuous_id):
+            # 找到该连续交易组的所有记录
+            continuous_mask = df['连续交易对ID'] == continuous_id
+            continuous_records = df[continuous_mask]
+
+            # 计算该连续交易组的总盈亏
+            try:
+                # 提取平仓盈亏列,转换为数值
+                close_profit_loss_str = continuous_records['平仓盈亏'].astype(str).str.replace(',', '')
+                # 尝试转换为数值,无法转换的设为0
+                close_profit_loss_numeric = pd.to_numeric(close_profit_loss_str, errors='coerce').fillna(0)
+                total_continuous_profit = close_profit_loss_numeric.sum()
+                # 将连续交易总盈亏填充到该组的所有记录中
+                df.loc[continuous_mask, '连续交易总盈亏'] = total_continuous_profit
+            except Exception as e:
+                # 如果计算失败,设为0
+                df.loc[continuous_mask, '连续交易总盈亏'] = 0
+
+    # 移除中间处理列
+    columns_to_remove = ['标的_完整', '交易类型_标准', '仓位操作', '方向', '成交数量_数值', '交易时间']
+    output_columns = [col for col in df.columns if col not in columns_to_remove]
+
+    # 调整列顺序,确保交易对ID、连续交易对ID、开仓时间、交易盈亏和连续交易总盈亏在最后
+    if '交易对ID' in output_columns:
+        output_columns.remove('交易对ID')
+    if '连续交易对ID' in output_columns:
+        output_columns.remove('连续交易对ID')
+    if '开仓时间' in output_columns:
+        output_columns.remove('开仓时间')
+    if '交易盈亏' in output_columns:
+        output_columns.remove('交易盈亏')
+    if '连续交易总盈亏' in output_columns:
+        output_columns.remove('连续交易总盈亏')
+    output_columns.append('交易对ID')
+    output_columns.append('连续交易对ID')
+    output_columns.append('开仓时间')
+    output_columns.append('交易盈亏')
+    output_columns.append('连续交易总盈亏')
+    
+    # 按交易对ID和日期升序排序
+    # 创建排序辅助列:未配对的排在最后,其他按ID数字排序
+    def get_sort_key(pair_id):
+        if pd.isna(pair_id) or pair_id == '' or pair_id == '未配对':
+            return (1, '')  # 未配对排在最后
+        elif isinstance(pair_id, str) and pair_id.startswith('P'):
+            try:
+                # 提取数字部分用于排序
+                num = int(pair_id[1:])
+                return (0, num)  # 已配对的排在前面,按数字排序
+            except:
+                return (1, pair_id)
+        else:
+            return (1, str(pair_id))
+    
+    df['_sort_key_id'] = df['交易对ID'].apply(get_sort_key)
+    
+    # 确保日期列可以排序(转换为datetime类型)
+    if '日期' in df.columns:
+        df['_sort_date'] = pd.to_datetime(df['日期'], errors='coerce')
+    else:
+        df['_sort_date'] = pd.NaT
+    
+    # 先按交易对ID排序,再按日期排序
+    df = df.sort_values(['_sort_key_id', '_sort_date'], ascending=[True, True]).reset_index(drop=True)
+    df = df.drop(['_sort_key_id', '_sort_date'], axis=1)
+    
+    # 保存到CSV
+    try:
+        df[output_columns].to_csv(output_path, index=False, encoding='utf-8-sig')
+        print(f"\n结果已保存到: {output_path}")
+        
+        # 获取文件大小
+        file_size = os.path.getsize(output_path) / 1024  # KB
+        print(f"文件大小: {file_size:.2f} KB")
+    except Exception as e:
+        print(f"保存文件时出错: {str(e)}")
+
+
+def print_statistics(df):
+    """
+    打印配对统计信息
+
+    参数:
+        df (pandas.DataFrame): 包含交易对ID的DataFrame
+    """
+    print("\n" + "=" * 60)
+    print("配对统计信息")
+    print("=" * 60)
+
+    # 统计已配对和未配对
+    paired = df[df['交易对ID'].str.startswith('P', na=False)]
+    unpaired = df[df['交易对ID'] == '未配对']
+
+    print(f"\n总交易记录: {len(df)} 条")
+    print(f"已配对交易: {len(paired)} 条 ({len(paired)/len(df)*100:.1f}%)")
+    print(f"未配对交易: {len(unpaired)} 条 ({len(unpaired)/len(df)*100:.1f}%)")
+
+    # 统计交易对数量
+    unique_pairs = paired['交易对ID'].nunique()
+    print(f"\n交易对数量: {unique_pairs} 对")
+
+    # 统计连续交易对
+    if '连续交易对ID' in df.columns:
+        continuous_pairs = df[df['连续交易对ID'] != 'N/A']
+        unique_continuous_pairs = continuous_pairs['连续交易对ID'].nunique()
+        print(f"\n连续交易对统计:")
+        print(f"  连续交易对数量: {unique_continuous_pairs} 组")
+        print(f"  涉及交易记录: {len(continuous_pairs)} 条")
+        if len(continuous_pairs) > 0:
+            # 统计每组连续交易对的交易对数量
+            continuous_stats = continuous_pairs.groupby('连续交易对ID')['交易对ID'].nunique()
+            print(f"  每组连续交易对包含的交易对数量分布:")
+            for continuous_id, pair_count in continuous_stats.items():
+                print(f"    {continuous_id}: {pair_count} 个交易对")
+
+    # 统计每个交易对的平仓次数分布
+    if len(paired) > 0:
+        pair_counts = paired.groupby('交易对ID').size()
+        print(f"\n交易对组成分布:")
+        distribution = pair_counts.value_counts().sort_index()
+        for count, freq in distribution.items():
+            if count == 2:
+                print(f"  1开1平: {freq} 对")
+            else:
+                print(f"  {count}笔成交(1开多平或多开1平): {freq} 对")
+
+    # 按标的统计
+    print(f"\n按标的统计:")
+    target_stats = df.groupby('标的_完整')['交易对ID'].apply(
+        lambda x: f"总:{len(x)}条, 已配对:{len(x[x.str.startswith('P', na=False)])}条"
+    )
+    for target, stats in target_stats.items():
+        print(f"  {target}: {stats}")
+
+
+def analyze_transaction_pairs(csv_filename=None, output_filename=None):
+    """
+    主函数:分析交易配对
+    
+    参数:
+        csv_filename (str): 输入CSV文件名
+        output_filename (str): 输出CSV文件名(可选)
+    """
+    print("=" * 60)
+    print("交易配对分析工具")
+    print("=" * 60)
+    
+    # 设置文件路径
+    if csv_filename is None:
+        csv_filename = 'transaction.csv'
+    
+    current_dir = _get_current_directory()
+    csv_path = os.path.join(current_dir, csv_filename)
+    
+    if not os.path.exists(csv_path):
+        print(f"错误: 文件不存在 - {csv_path}")
+        return
+    
+    # 设置输出文件名
+    if output_filename is None:
+        timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
+        base_name = os.path.splitext(csv_filename)[0]
+        output_filename = f"{base_name}_paired_{timestamp}.csv"
+    
+    output_path = os.path.join(current_dir, output_filename)
+    
+    # 步骤1: 读取CSV
+    print(f"\n步骤1: 读取CSV文件")
+    print(f"文件路径: {csv_path}")
+    df = read_transaction_csv(csv_path)
+    
+    if len(df) == 0:
+        print("错误: 无法读取数据")
+        return
+    
+    # 步骤2: 解析数据
+    print(f"\n步骤2: 解析交易数据")
+    df = parse_transaction_data(df)
+    
+    if len(df) == 0:
+        print("错误: 没有有效的交易记录")
+        return
+    
+    # 步骤3: 配对交易
+    print(f"\n步骤3: 配对交易")
+    df = pair_transactions(df)
+
+    # 步骤4: 识别连续交易对
+    print(f"\n步骤4: 识别连续交易对")
+    df = identify_continuous_trade_pairs(df)
+
+    # 步骤5: 保存结果
+    print(f"\n步骤5: 保存结果")
+    save_result(df, output_path)
+    
+    # 步骤6: 打印统计信息
+    print_statistics(df)
+
+    print("\n" + "=" * 60)
+    print("分析完成")
+    print("=" * 60)
+
+
+# 使用示例
+if __name__ == "__main__":
+    # 可以指定CSV文件名,如果不指定则使用默认的 transaction.csv
+    analyze_transaction_pairs()
+    # 或者指定特定文件: analyze_transaction_pairs('transaction.csv')
+

+ 153 - 0
Lib/stock/Multi-factor.py

@@ -0,0 +1,153 @@
+# 克隆自聚宽文章:https://www.joinquant.com/post/65807
+# 标题:11年26倍,简单的多因子策略
+# 作者:foolmouse
+# https://www.joinquant.com/view/community/detail/625cc5268324506f6746fc0c6c605c44
+# 很简单的几个随手可得的因子:
+
+# ROA高。
+# 归母净利润增长快。
+# PB排名小且小于2。
+# 就这些,没有其他花里胡哨诘屈聱牙的高频因子,没有复杂计算,只是简单的排序,我觉得比较满意的几点:
+
+# 没有现在满社区的小市值暴露(虽然我也有小市值仓位,但是不会是重仓)。
+# 所有因子都是理论基础很扎实的几个基础因子,即使暂时会失效长期看应该有效的。
+# 没有太多参数(只有一个PB小于2,这个存在后视镜,是根据结果优化的。去掉也行但是2015年股灾回撤比较大,如果实盘我会考虑指数估值什么的大盘择时,但是聚宽没有这个因子因此没用)
+# 在牛市后半程会跑输,但是大部分熊市后会赚回来的。
+
+
+'''
+1.市净率小于2;
+2.负债比例高于市场平均值;
+3.企业的流动资产至少是流动负债的1.2倍;
+4.每年四次调仓,即在1/4/7/10月调仓;
+5.可加入止损(十天HS300跌幅达10%清仓);
+'''
+
+import datetime  # filter_new_stock 中用到 datetime.timedelta
+## 初始化函数,设定要操作的股票、基准等等
+def initialize(context):
+    # 设定指数
+    g.stockindex = '000300.XSHG' 
+    # 设定沪深300作为基准
+    set_benchmark('000300.XSHG')
+    # True为开启动态复权模式,使用真实价格交易
+    set_option('use_real_price', True) 
+    # 设定成交量比例
+    set_option('order_volume_ratio', 1)
+    # 股票类交易手续费是:买入时佣金万分之三,卖出时佣金万分之三加千分之一印花税, 每笔交易佣金最低扣5块钱
+    set_order_cost(OrderCost(open_tax=0, close_tax=0.001, \
+                             open_commission=0.0003, close_commission=0.0003,\
+                             close_today_commission=0, min_commission=5), type='stock')
+    # 最大持仓数量
+    g.stocknum = 10
+
+    ## 自动设定调仓月份(如需使用自动,取消本段注释)
+    """
+    f = 4  # 调仓频率
+    log.info(list(range(1, 13, 12 // f)))  # range 的步长须为整数,使用整除
+    g.Transfer_date = range(1, 13, 12 // f)
+    """
+    ## 手动设定调仓月份(如需使用手动,注释掉上段)
+    # g.Transfer_date = (3,9)
+    
+    #根据大盘止损,如不想加入大盘止损,注释下句即可
+    # run_daily(dapan_stoploss, time='open') 
+    
+    ## 按周调用交易程序(每周一开盘时运行)
+    run_weekly(trade, weekday=1, time='open')
+
+## 交易函数
+def trade(context):
+    ## 获得Buylist
+    Buylist = check_stocks(context)
+    value = context.portfolio.total_value
+    hold_count = min(len(Buylist),g.stocknum)
+    per_value = value/hold_count if hold_count>0 else 0
+    ## 卖出
+    if len(context.portfolio.positions) > 0:
+        for stock in context.portfolio.positions.keys():
+            if stock not in Buylist:
+                order_target_value(stock, 0)
+            else:
+                order_target_value(stock,per_value)
+    ## 买入
+    if len(Buylist) > 0:
+        for stock in Buylist:
+            if stock not in context.portfolio.positions.keys():
+                order_target_value(stock, per_value)
+    
+## 选股函数
+def check_stocks(context):
+    # 获取所有股票
+    yesterday = context.previous_date
+    security = get_all_securities("stock", yesterday).index.tolist()
+    # 排除st,科创,新股
+    security = filter_st_stock(security)
+    security = filter_kcbj_stock(security)
+    security = filter_new_stock(context,security)
+    # 获取因子值
+    Stocks = get_fundamentals(query(
+            valuation.code,
+            valuation.pb_ratio,
+            indicator.roa,
+            indicator.inc_net_profit_to_shareholders_year_on_year
+        ).filter(
+            valuation.code.in_(security),
+            valuation.pb_ratio<2
+        ))
+    
+    stock_count = len(Stocks)
+    # 计算roa靠前的
+    Stocks = Stocks.sort_values(by='roa',ascending=False).iloc[:int(stock_count/10)]
+    # 计算增长率靠前的
+    stock_count = len(Stocks)
+    Stocks = Stocks.sort_values(by='inc_net_profit_to_shareholders_year_on_year',ascending=False).iloc[:int(stock_count/10)]
+    # 计算PB最小的
+    Stocks = Stocks.sort_values(by='pb_ratio',ascending=True).iloc[:10]
+
+    Codes = Stocks.code
+
+    return list(Codes)
+# 过滤停牌股票
+def filter_paused_stock(stock_list):
+    current_data = get_current_data()
+    return [stock for stock in stock_list if not current_data[stock].paused]
+
+
+# 过滤ST及其他具有退市标签的股票
+def filter_st_stock(stock_list):
+    current_data = get_current_data()
+    return [stock for stock in stock_list
+            if not current_data[stock].is_st
+            and 'ST' not in current_data[stock].name
+            and '*' not in current_data[stock].name
+            and '退' not in current_data[stock].name]
+
+
+# 过滤科创北交股票
+def filter_kcbj_stock(stock_list):
+    for stock in stock_list[:]:
+        if stock[0] == '4' or stock[0] == '8' or stock[:2] == '68':
+            stock_list.remove(stock)
+    return stock_list
+
+
+# 过滤涨停的股票
+def filter_limitup_stock(context, stock_list):
+    last_prices = history(1, unit='1m', field='close', security_list=stock_list)
+    current_data = get_current_data()
+    return [stock for stock in stock_list if stock in context.portfolio.positions.keys()
+            or last_prices[stock][-1] < current_data[stock].high_limit]
+
+
+# 过滤跌停的股票
+def filter_limitdown_stock(context, stock_list):
+    last_prices = history(1, unit='1m', field='close', security_list=stock_list)
+    current_data = get_current_data()
+    return [stock for stock in stock_list if (stock in context.portfolio.positions.keys()
+            or last_prices[stock][-1] > current_data[stock].low_limit)]
+
+
+# 过滤次新股
+def filter_new_stock(context, stock_list):
+    yesterday = context.previous_date
+    return [stock for stock in stock_list if not yesterday - get_security_info(stock).start_date < datetime.timedelta(days=375)]

+ 2 - 0
pyproject.toml

@@ -7,6 +7,8 @@ requires-python = ">=3.11"
 dependencies = [
     "beautifulsoup4>=4.14.2",
     "matplotlib>=3.10.3",
+    "nltk>=3.9.2",
     "pandas>=2.3.1",
     "requests>=2.32.5",
+    "seaborn>=0.13.2",
 ]

+ 167 - 0
uv.lock

@@ -101,6 +101,27 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/0a/4c/925909008ed5a988ccbb72dcc897407e5d6d3bd72410d69e051fc0c14647/charset_normalizer-3.4.4-py3-none-any.whl", hash = "sha256:7a32c560861a02ff789ad905a2fe94e3f840803362c84fecf1851cb4cf3dc37f", size = 53402, upload-time = "2025-10-14T04:42:31.76Z" },
 ]
 
+[[package]]
+name = "click"
+version = "8.3.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "colorama", marker = "sys_platform == 'win32'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/3d/fa/656b739db8587d7b5dfa22e22ed02566950fbfbcdc20311993483657a5c0/click-8.3.1.tar.gz", hash = "sha256:12ff4785d337a1bb490bb7e9c2b1ee5da3112e94a8622f26a6c77f5d2fc6842a", size = 295065, upload-time = "2025-11-15T20:45:42.706Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/98/78/01c019cdb5d6498122777c1a43056ebb3ebfeef2076d9d026bfe15583b2b/click-8.3.1-py3-none-any.whl", hash = "sha256:981153a64e25f12d547d3426c367a4857371575ee7ad18df2a6183ab0545b2a6", size = 108274, upload-time = "2025-11-15T20:45:41.139Z" },
+]
+
+[[package]]
+name = "colorama"
+version = "0.4.6"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" },
+]
+
 [[package]]
 name = "contourpy"
 version = "1.3.2"
@@ -206,6 +227,15 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/0e/61/66938bbb5fc52dbdf84594873d5b51fb1f7c7794e9c0f5bd885f30bc507b/idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea", size = 71008, upload-time = "2025-10-12T14:55:18.883Z" },
 ]
 
+[[package]]
+name = "joblib"
+version = "1.5.3"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/41/f2/d34e8b3a08a9cc79a50b2208a93dce981fe615b64d5a4d4abee421d898df/joblib-1.5.3.tar.gz", hash = "sha256:8561a3269e6801106863fd0d6d84bb737be9e7631e33aaed3fb9ce5953688da3", size = 331603, upload-time = "2025-12-15T08:41:46.427Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/7b/91/984aca2ec129e2757d1e4e3c81c3fcda9d0f85b74670a094cc443d9ee949/joblib-1.5.3-py3-none-any.whl", hash = "sha256:5fc3c5039fc5ca8c0276333a188bbd59d6b7ab37fe6632daa76bc7f9ec18e713", size = 309071, upload-time = "2025-12-15T08:41:44.973Z" },
+]
+
 [[package]]
 name = "jukuan"
 version = "0.1.0"
@@ -213,16 +243,20 @@ source = { virtual = "." }
 dependencies = [
     { name = "beautifulsoup4" },
     { name = "matplotlib" },
+    { name = "nltk" },
     { name = "pandas" },
     { name = "requests" },
+    { name = "seaborn" },
 ]
 
 [package.metadata]
 requires-dist = [
     { name = "beautifulsoup4", specifier = ">=4.14.2" },
     { name = "matplotlib", specifier = ">=3.10.3" },
+    { name = "nltk", specifier = ">=3.9.2" },
     { name = "pandas", specifier = ">=2.3.1" },
     { name = "requests", specifier = ">=2.32.5" },
+    { name = "seaborn", specifier = ">=0.13.2" },
 ]
 
 [[package]]
@@ -334,6 +368,21 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/1b/92/9a45c91089c3cf690b5badd4be81e392ff086ccca8a1d4e3a08463d8a966/matplotlib-3.10.3-cp313-cp313t-win_amd64.whl", hash = "sha256:4f23ffe95c5667ef8a2b56eea9b53db7f43910fa4a2d5472ae0f72b64deab4d5", size = 8139044, upload-time = "2025-05-08T19:10:44.551Z" },
 ]
 
+[[package]]
+name = "nltk"
+version = "3.9.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "click" },
+    { name = "joblib" },
+    { name = "regex" },
+    { name = "tqdm" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/f9/76/3a5e4312c19a028770f86fd7c058cf9f4ec4321c6cf7526bab998a5b683c/nltk-3.9.2.tar.gz", hash = "sha256:0f409e9b069ca4177c1903c3e843eef90c7e92992fa4931ae607da6de49e1419", size = 2887629, upload-time = "2025-10-01T07:19:23.764Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/60/90/81ac364ef94209c100e12579629dc92bf7a709a84af32f8c551b02c07e94/nltk-3.9.2-py3-none-any.whl", hash = "sha256:1e209d2b3009110635ed9709a67a1a3e33a10f799490fa71cf4bec218c11c88a", size = 1513404, upload-time = "2025-10-01T07:19:21.648Z" },
+]
+
 [[package]]
 name = "numpy"
 version = "2.3.1"
@@ -556,6 +605,98 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/81/c4/34e93fe5f5429d7570ec1fa436f1986fb1f00c3e0f43a589fe2bbcd22c3f/pytz-2025.2-py2.py3-none-any.whl", hash = "sha256:5ddf76296dd8c44c26eb8f4b6f35488f3ccbf6fbbd7adee0b7262d43f0ec2f00", size = 509225, upload-time = "2025-03-25T02:24:58.468Z" },
 ]
 
+[[package]]
+name = "regex"
+version = "2025.11.3"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/cc/a9/546676f25e573a4cf00fe8e119b78a37b6a8fe2dc95cda877b30889c9c45/regex-2025.11.3.tar.gz", hash = "sha256:1fedc720f9bb2494ce31a58a1631f9c82df6a09b49c19517ea5cc280b4541e01", size = 414669, upload-time = "2025-11-03T21:34:22.089Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/f7/90/4fb5056e5f03a7048abd2b11f598d464f0c167de4f2a51aa868c376b8c70/regex-2025.11.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:eadade04221641516fa25139273505a1c19f9bf97589a05bc4cfcd8b4a618031", size = 488081, upload-time = "2025-11-03T21:31:11.946Z" },
+    { url = "https://files.pythonhosted.org/packages/85/23/63e481293fac8b069d84fba0299b6666df720d875110efd0338406b5d360/regex-2025.11.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:feff9e54ec0dd3833d659257f5c3f5322a12eee58ffa360984b716f8b92983f4", size = 290554, upload-time = "2025-11-03T21:31:13.387Z" },
+    { url = "https://files.pythonhosted.org/packages/2b/9d/b101d0262ea293a0066b4522dfb722eb6a8785a8c3e084396a5f2c431a46/regex-2025.11.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3b30bc921d50365775c09a7ed446359e5c0179e9e2512beec4a60cbcef6ddd50", size = 288407, upload-time = "2025-11-03T21:31:14.809Z" },
+    { url = "https://files.pythonhosted.org/packages/0c/64/79241c8209d5b7e00577ec9dca35cd493cc6be35b7d147eda367d6179f6d/regex-2025.11.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f99be08cfead2020c7ca6e396c13543baea32343b7a9a5780c462e323bd8872f", size = 793418, upload-time = "2025-11-03T21:31:16.556Z" },
+    { url = "https://files.pythonhosted.org/packages/3d/e2/23cd5d3573901ce8f9757c92ca4db4d09600b865919b6d3e7f69f03b1afd/regex-2025.11.3-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6dd329a1b61c0ee95ba95385fb0c07ea0d3fe1a21e1349fa2bec272636217118", size = 860448, upload-time = "2025-11-03T21:31:18.12Z" },
+    { url = "https://files.pythonhosted.org/packages/2a/4c/aecf31beeaa416d0ae4ecb852148d38db35391aac19c687b5d56aedf3a8b/regex-2025.11.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:4c5238d32f3c5269d9e87be0cf096437b7622b6920f5eac4fd202468aaeb34d2", size = 907139, upload-time = "2025-11-03T21:31:20.753Z" },
+    { url = "https://files.pythonhosted.org/packages/61/22/b8cb00df7d2b5e0875f60628594d44dba283e951b1ae17c12f99e332cc0a/regex-2025.11.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:10483eefbfb0adb18ee9474498c9a32fcf4e594fbca0543bb94c48bac6183e2e", size = 800439, upload-time = "2025-11-03T21:31:22.069Z" },
+    { url = "https://files.pythonhosted.org/packages/02/a8/c4b20330a5cdc7a8eb265f9ce593f389a6a88a0c5f280cf4d978f33966bc/regex-2025.11.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:78c2d02bb6e1da0720eedc0bad578049cad3f71050ef8cd065ecc87691bed2b0", size = 782965, upload-time = "2025-11-03T21:31:23.598Z" },
+    { url = "https://files.pythonhosted.org/packages/b4/4c/ae3e52988ae74af4b04d2af32fee4e8077f26e51b62ec2d12d246876bea2/regex-2025.11.3-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e6b49cd2aad93a1790ce9cffb18964f6d3a4b0b3dbdbd5de094b65296fce6e58", size = 854398, upload-time = "2025-11-03T21:31:25.008Z" },
+    { url = "https://files.pythonhosted.org/packages/06/d1/a8b9cf45874eda14b2e275157ce3b304c87e10fb38d9fc26a6e14eb18227/regex-2025.11.3-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:885b26aa3ee56433b630502dc3d36ba78d186a00cc535d3806e6bfd9ed3c70ab", size = 845897, upload-time = "2025-11-03T21:31:26.427Z" },
+    { url = "https://files.pythonhosted.org/packages/ea/fe/1830eb0236be93d9b145e0bd8ab499f31602fe0999b1f19e99955aa8fe20/regex-2025.11.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ddd76a9f58e6a00f8772e72cff8ebcff78e022be95edf018766707c730593e1e", size = 788906, upload-time = "2025-11-03T21:31:28.078Z" },
+    { url = "https://files.pythonhosted.org/packages/66/47/dc2577c1f95f188c1e13e2e69d8825a5ac582ac709942f8a03af42ed6e93/regex-2025.11.3-cp311-cp311-win32.whl", hash = "sha256:3e816cc9aac1cd3cc9a4ec4d860f06d40f994b5c7b4d03b93345f44e08cc68bf", size = 265812, upload-time = "2025-11-03T21:31:29.72Z" },
+    { url = "https://files.pythonhosted.org/packages/50/1e/15f08b2f82a9bbb510621ec9042547b54d11e83cb620643ebb54e4eb7d71/regex-2025.11.3-cp311-cp311-win_amd64.whl", hash = "sha256:087511f5c8b7dfbe3a03f5d5ad0c2a33861b1fc387f21f6f60825a44865a385a", size = 277737, upload-time = "2025-11-03T21:31:31.422Z" },
+    { url = "https://files.pythonhosted.org/packages/f4/fc/6500eb39f5f76c5e47a398df82e6b535a5e345f839581012a418b16f9cc3/regex-2025.11.3-cp311-cp311-win_arm64.whl", hash = "sha256:1ff0d190c7f68ae7769cd0313fe45820ba07ffebfddfaa89cc1eb70827ba0ddc", size = 270290, upload-time = "2025-11-03T21:31:33.041Z" },
+    { url = "https://files.pythonhosted.org/packages/e8/74/18f04cb53e58e3fb107439699bd8375cf5a835eec81084e0bddbd122e4c2/regex-2025.11.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:bc8ab71e2e31b16e40868a40a69007bc305e1109bd4658eb6cad007e0bf67c41", size = 489312, upload-time = "2025-11-03T21:31:34.343Z" },
+    { url = "https://files.pythonhosted.org/packages/78/3f/37fcdd0d2b1e78909108a876580485ea37c91e1acf66d3bb8e736348f441/regex-2025.11.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:22b29dda7e1f7062a52359fca6e58e548e28c6686f205e780b02ad8ef710de36", size = 291256, upload-time = "2025-11-03T21:31:35.675Z" },
+    { url = "https://files.pythonhosted.org/packages/bf/26/0a575f58eb23b7ebd67a45fccbc02ac030b737b896b7e7a909ffe43ffd6a/regex-2025.11.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3a91e4a29938bc1a082cc28fdea44be420bf2bebe2665343029723892eb073e1", size = 288921, upload-time = "2025-11-03T21:31:37.07Z" },
+    { url = "https://files.pythonhosted.org/packages/ea/98/6a8dff667d1af907150432cf5abc05a17ccd32c72a3615410d5365ac167a/regex-2025.11.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:08b884f4226602ad40c5d55f52bf91a9df30f513864e0054bad40c0e9cf1afb7", size = 798568, upload-time = "2025-11-03T21:31:38.784Z" },
+    { url = "https://files.pythonhosted.org/packages/64/15/92c1db4fa4e12733dd5a526c2dd2b6edcbfe13257e135fc0f6c57f34c173/regex-2025.11.3-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3e0b11b2b2433d1c39c7c7a30e3f3d0aeeea44c2a8d0bae28f6b95f639927a69", size = 864165, upload-time = "2025-11-03T21:31:40.559Z" },
+    { url = "https://files.pythonhosted.org/packages/f9/e7/3ad7da8cdee1ce66c7cd37ab5ab05c463a86ffeb52b1a25fe7bd9293b36c/regex-2025.11.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:87eb52a81ef58c7ba4d45c3ca74e12aa4b4e77816f72ca25258a85b3ea96cb48", size = 912182, upload-time = "2025-11-03T21:31:42.002Z" },
+    { url = "https://files.pythonhosted.org/packages/84/bd/9ce9f629fcb714ffc2c3faf62b6766ecb7a585e1e885eb699bcf130a5209/regex-2025.11.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a12ab1f5c29b4e93db518f5e3872116b7e9b1646c9f9f426f777b50d44a09e8c", size = 803501, upload-time = "2025-11-03T21:31:43.815Z" },
+    { url = "https://files.pythonhosted.org/packages/7c/0f/8dc2e4349d8e877283e6edd6c12bdcebc20f03744e86f197ab6e4492bf08/regex-2025.11.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7521684c8c7c4f6e88e35ec89680ee1aa8358d3f09d27dfbdf62c446f5d4c695", size = 787842, upload-time = "2025-11-03T21:31:45.353Z" },
+    { url = "https://files.pythonhosted.org/packages/f9/73/cff02702960bc185164d5619c0c62a2f598a6abff6695d391b096237d4ab/regex-2025.11.3-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:7fe6e5440584e94cc4b3f5f4d98a25e29ca12dccf8873679a635638349831b98", size = 858519, upload-time = "2025-11-03T21:31:46.814Z" },
+    { url = "https://files.pythonhosted.org/packages/61/83/0e8d1ae71e15bc1dc36231c90b46ee35f9d52fab2e226b0e039e7ea9c10a/regex-2025.11.3-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:8e026094aa12b43f4fd74576714e987803a315c76edb6b098b9809db5de58f74", size = 850611, upload-time = "2025-11-03T21:31:48.289Z" },
+    { url = "https://files.pythonhosted.org/packages/c8/f5/70a5cdd781dcfaa12556f2955bf170cd603cb1c96a1827479f8faea2df97/regex-2025.11.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:435bbad13e57eb5606a68443af62bed3556de2f46deb9f7d4237bc2f1c9fb3a0", size = 789759, upload-time = "2025-11-03T21:31:49.759Z" },
+    { url = "https://files.pythonhosted.org/packages/59/9b/7c29be7903c318488983e7d97abcf8ebd3830e4c956c4c540005fcfb0462/regex-2025.11.3-cp312-cp312-win32.whl", hash = "sha256:3839967cf4dc4b985e1570fd8d91078f0c519f30491c60f9ac42a8db039be204", size = 266194, upload-time = "2025-11-03T21:31:51.53Z" },
+    { url = "https://files.pythonhosted.org/packages/1a/67/3b92df89f179d7c367be654ab5626ae311cb28f7d5c237b6bb976cd5fbbb/regex-2025.11.3-cp312-cp312-win_amd64.whl", hash = "sha256:e721d1b46e25c481dc5ded6f4b3f66c897c58d2e8cfdf77bbced84339108b0b9", size = 277069, upload-time = "2025-11-03T21:31:53.151Z" },
+    { url = "https://files.pythonhosted.org/packages/d7/55/85ba4c066fe5094d35b249c3ce8df0ba623cfd35afb22d6764f23a52a1c5/regex-2025.11.3-cp312-cp312-win_arm64.whl", hash = "sha256:64350685ff08b1d3a6fff33f45a9ca183dc1d58bbfe4981604e70ec9801bbc26", size = 270330, upload-time = "2025-11-03T21:31:54.514Z" },
+    { url = "https://files.pythonhosted.org/packages/e1/a7/dda24ebd49da46a197436ad96378f17df30ceb40e52e859fc42cac45b850/regex-2025.11.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:c1e448051717a334891f2b9a620fe36776ebf3dd8ec46a0b877c8ae69575feb4", size = 489081, upload-time = "2025-11-03T21:31:55.9Z" },
+    { url = "https://files.pythonhosted.org/packages/19/22/af2dc751aacf88089836aa088a1a11c4f21a04707eb1b0478e8e8fb32847/regex-2025.11.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:9b5aca4d5dfd7fbfbfbdaf44850fcc7709a01146a797536a8f84952e940cca76", size = 291123, upload-time = "2025-11-03T21:31:57.758Z" },
+    { url = "https://files.pythonhosted.org/packages/a3/88/1a3ea5672f4b0a84802ee9891b86743438e7c04eb0b8f8c4e16a42375327/regex-2025.11.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:04d2765516395cf7dda331a244a3282c0f5ae96075f728629287dfa6f76ba70a", size = 288814, upload-time = "2025-11-03T21:32:01.12Z" },
+    { url = "https://files.pythonhosted.org/packages/fb/8c/f5987895bf42b8ddeea1b315c9fedcfe07cadee28b9c98cf50d00adcb14d/regex-2025.11.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5d9903ca42bfeec4cebedba8022a7c97ad2aab22e09573ce9976ba01b65e4361", size = 798592, upload-time = "2025-11-03T21:32:03.006Z" },
+    { url = "https://files.pythonhosted.org/packages/99/2a/6591ebeede78203fa77ee46a1c36649e02df9eaa77a033d1ccdf2fcd5d4e/regex-2025.11.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:639431bdc89d6429f6721625e8129413980ccd62e9d3f496be618a41d205f160", size = 864122, upload-time = "2025-11-03T21:32:04.553Z" },
+    { url = "https://files.pythonhosted.org/packages/94/d6/be32a87cf28cf8ed064ff281cfbd49aefd90242a83e4b08b5a86b38e8eb4/regex-2025.11.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f117efad42068f9715677c8523ed2be1518116d1c49b1dd17987716695181efe", size = 912272, upload-time = "2025-11-03T21:32:06.148Z" },
+    { url = "https://files.pythonhosted.org/packages/62/11/9bcef2d1445665b180ac7f230406ad80671f0fc2a6ffb93493b5dd8cd64c/regex-2025.11.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4aecb6f461316adf9f1f0f6a4a1a3d79e045f9b71ec76055a791affa3b285850", size = 803497, upload-time = "2025-11-03T21:32:08.162Z" },
+    { url = "https://files.pythonhosted.org/packages/e5/a7/da0dc273d57f560399aa16d8a68ae7f9b57679476fc7ace46501d455fe84/regex-2025.11.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:3b3a5f320136873cc5561098dfab677eea139521cb9a9e8db98b7e64aef44cbc", size = 787892, upload-time = "2025-11-03T21:32:09.769Z" },
+    { url = "https://files.pythonhosted.org/packages/da/4b/732a0c5a9736a0b8d6d720d4945a2f1e6f38f87f48f3173559f53e8d5d82/regex-2025.11.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:75fa6f0056e7efb1f42a1c34e58be24072cb9e61a601340cc1196ae92326a4f9", size = 858462, upload-time = "2025-11-03T21:32:11.769Z" },
+    { url = "https://files.pythonhosted.org/packages/0c/f5/a2a03df27dc4c2d0c769220f5110ba8c4084b0bfa9ab0f9b4fcfa3d2b0fc/regex-2025.11.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:dbe6095001465294f13f1adcd3311e50dd84e5a71525f20a10bd16689c61ce0b", size = 850528, upload-time = "2025-11-03T21:32:13.906Z" },
+    { url = "https://files.pythonhosted.org/packages/d6/09/e1cd5bee3841c7f6eb37d95ca91cdee7100b8f88b81e41c2ef426910891a/regex-2025.11.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:454d9b4ae7881afbc25015b8627c16d88a597479b9dea82b8c6e7e2e07240dc7", size = 789866, upload-time = "2025-11-03T21:32:15.748Z" },
+    { url = "https://files.pythonhosted.org/packages/eb/51/702f5ea74e2a9c13d855a6a85b7f80c30f9e72a95493260193c07f3f8d74/regex-2025.11.3-cp313-cp313-win32.whl", hash = "sha256:28ba4d69171fc6e9896337d4fc63a43660002b7da53fc15ac992abcf3410917c", size = 266189, upload-time = "2025-11-03T21:32:17.493Z" },
+    { url = "https://files.pythonhosted.org/packages/8b/00/6e29bb314e271a743170e53649db0fdb8e8ff0b64b4f425f5602f4eb9014/regex-2025.11.3-cp313-cp313-win_amd64.whl", hash = "sha256:bac4200befe50c670c405dc33af26dad5a3b6b255dd6c000d92fe4629f9ed6a5", size = 277054, upload-time = "2025-11-03T21:32:19.042Z" },
+    { url = "https://files.pythonhosted.org/packages/25/f1/b156ff9f2ec9ac441710764dda95e4edaf5f36aca48246d1eea3f1fd96ec/regex-2025.11.3-cp313-cp313-win_arm64.whl", hash = "sha256:2292cd5a90dab247f9abe892ac584cb24f0f54680c73fcb4a7493c66c2bf2467", size = 270325, upload-time = "2025-11-03T21:32:21.338Z" },
+    { url = "https://files.pythonhosted.org/packages/20/28/fd0c63357caefe5680b8ea052131acbd7f456893b69cc2a90cc3e0dc90d4/regex-2025.11.3-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:1eb1ebf6822b756c723e09f5186473d93236c06c579d2cc0671a722d2ab14281", size = 491984, upload-time = "2025-11-03T21:32:23.466Z" },
+    { url = "https://files.pythonhosted.org/packages/df/ec/7014c15626ab46b902b3bcc4b28a7bae46d8f281fc7ea9c95e22fcaaa917/regex-2025.11.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:1e00ec2970aab10dc5db34af535f21fcf32b4a31d99e34963419636e2f85ae39", size = 292673, upload-time = "2025-11-03T21:32:25.034Z" },
+    { url = "https://files.pythonhosted.org/packages/23/ab/3b952ff7239f20d05f1f99e9e20188513905f218c81d52fb5e78d2bf7634/regex-2025.11.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a4cb042b615245d5ff9b3794f56be4138b5adc35a4166014d31d1814744148c7", size = 291029, upload-time = "2025-11-03T21:32:26.528Z" },
+    { url = "https://files.pythonhosted.org/packages/21/7e/3dc2749fc684f455f162dcafb8a187b559e2614f3826877d3844a131f37b/regex-2025.11.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:44f264d4bf02f3176467d90b294d59bf1db9fe53c141ff772f27a8b456b2a9ed", size = 807437, upload-time = "2025-11-03T21:32:28.363Z" },
+    { url = "https://files.pythonhosted.org/packages/1b/0b/d529a85ab349c6a25d1ca783235b6e3eedf187247eab536797021f7126c6/regex-2025.11.3-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:7be0277469bf3bd7a34a9c57c1b6a724532a0d235cd0dc4e7f4316f982c28b19", size = 873368, upload-time = "2025-11-03T21:32:30.4Z" },
+    { url = "https://files.pythonhosted.org/packages/7d/18/2d868155f8c9e3e9d8f9e10c64e9a9f496bb8f7e037a88a8bed26b435af6/regex-2025.11.3-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0d31e08426ff4b5b650f68839f5af51a92a5b51abd8554a60c2fbc7c71f25d0b", size = 914921, upload-time = "2025-11-03T21:32:32.123Z" },
+    { url = "https://files.pythonhosted.org/packages/2d/71/9d72ff0f354fa783fe2ba913c8734c3b433b86406117a8db4ea2bf1c7a2f/regex-2025.11.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e43586ce5bd28f9f285a6e729466841368c4a0353f6fd08d4ce4630843d3648a", size = 812708, upload-time = "2025-11-03T21:32:34.305Z" },
+    { url = "https://files.pythonhosted.org/packages/e7/19/ce4bf7f5575c97f82b6e804ffb5c4e940c62609ab2a0d9538d47a7fdf7d4/regex-2025.11.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:0f9397d561a4c16829d4e6ff75202c1c08b68a3bdbfe29dbfcdb31c9830907c6", size = 795472, upload-time = "2025-11-03T21:32:36.364Z" },
+    { url = "https://files.pythonhosted.org/packages/03/86/fd1063a176ffb7b2315f9a1b08d17b18118b28d9df163132615b835a26ee/regex-2025.11.3-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:dd16e78eb18ffdb25ee33a0682d17912e8cc8a770e885aeee95020046128f1ce", size = 868341, upload-time = "2025-11-03T21:32:38.042Z" },
+    { url = "https://files.pythonhosted.org/packages/12/43/103fb2e9811205e7386366501bc866a164a0430c79dd59eac886a2822950/regex-2025.11.3-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:ffcca5b9efe948ba0661e9df0fa50d2bc4b097c70b9810212d6b62f05d83b2dd", size = 854666, upload-time = "2025-11-03T21:32:40.079Z" },
+    { url = "https://files.pythonhosted.org/packages/7d/22/e392e53f3869b75804762c7c848bd2dd2abf2b70fb0e526f58724638bd35/regex-2025.11.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:c56b4d162ca2b43318ac671c65bd4d563e841a694ac70e1a976ac38fcf4ca1d2", size = 799473, upload-time = "2025-11-03T21:32:42.148Z" },
+    { url = "https://files.pythonhosted.org/packages/4f/f9/8bd6b656592f925b6845fcbb4d57603a3ac2fb2373344ffa1ed70aa6820a/regex-2025.11.3-cp313-cp313t-win32.whl", hash = "sha256:9ddc42e68114e161e51e272f667d640f97e84a2b9ef14b7477c53aac20c2d59a", size = 268792, upload-time = "2025-11-03T21:32:44.13Z" },
+    { url = "https://files.pythonhosted.org/packages/e5/87/0e7d603467775ff65cd2aeabf1b5b50cc1c3708556a8b849a2fa4dd1542b/regex-2025.11.3-cp313-cp313t-win_amd64.whl", hash = "sha256:7a7c7fdf755032ffdd72c77e3d8096bdcb0eb92e89e17571a196f03d88b11b3c", size = 280214, upload-time = "2025-11-03T21:32:45.853Z" },
+    { url = "https://files.pythonhosted.org/packages/8d/d0/2afc6f8e94e2b64bfb738a7c2b6387ac1699f09f032d363ed9447fd2bb57/regex-2025.11.3-cp313-cp313t-win_arm64.whl", hash = "sha256:df9eb838c44f570283712e7cff14c16329a9f0fb19ca492d21d4b7528ee6821e", size = 271469, upload-time = "2025-11-03T21:32:48.026Z" },
+    { url = "https://files.pythonhosted.org/packages/31/e9/f6e13de7e0983837f7b6d238ad9458800a874bf37c264f7923e63409944c/regex-2025.11.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:9697a52e57576c83139d7c6f213d64485d3df5bf84807c35fa409e6c970801c6", size = 489089, upload-time = "2025-11-03T21:32:50.027Z" },
+    { url = "https://files.pythonhosted.org/packages/a3/5c/261f4a262f1fa65141c1b74b255988bd2fa020cc599e53b080667d591cfc/regex-2025.11.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:e18bc3f73bd41243c9b38a6d9f2366cd0e0137a9aebe2d8ff76c5b67d4c0a3f4", size = 291059, upload-time = "2025-11-03T21:32:51.682Z" },
+    { url = "https://files.pythonhosted.org/packages/8e/57/f14eeb7f072b0e9a5a090d1712741fd8f214ec193dba773cf5410108bb7d/regex-2025.11.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:61a08bcb0ec14ff4e0ed2044aad948d0659604f824cbd50b55e30b0ec6f09c73", size = 288900, upload-time = "2025-11-03T21:32:53.569Z" },
+    { url = "https://files.pythonhosted.org/packages/3c/6b/1d650c45e99a9b327586739d926a1cd4e94666b1bd4af90428b36af66dc7/regex-2025.11.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c9c30003b9347c24bcc210958c5d167b9e4f9be786cb380a7d32f14f9b84674f", size = 799010, upload-time = "2025-11-03T21:32:55.222Z" },
+    { url = "https://files.pythonhosted.org/packages/99/ee/d66dcbc6b628ce4e3f7f0cbbb84603aa2fc0ffc878babc857726b8aab2e9/regex-2025.11.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4e1e592789704459900728d88d41a46fe3969b82ab62945560a31732ffc19a6d", size = 864893, upload-time = "2025-11-03T21:32:57.239Z" },
+    { url = "https://files.pythonhosted.org/packages/bf/2d/f238229f1caba7ac87a6c4153d79947fb0261415827ae0f77c304260c7d3/regex-2025.11.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6538241f45eb5a25aa575dbba1069ad786f68a4f2773a29a2bd3dd1f9de787be", size = 911522, upload-time = "2025-11-03T21:32:59.274Z" },
+    { url = "https://files.pythonhosted.org/packages/bd/3d/22a4eaba214a917c80e04f6025d26143690f0419511e0116508e24b11c9b/regex-2025.11.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bce22519c989bb72a7e6b36a199384c53db7722fe669ba891da75907fe3587db", size = 803272, upload-time = "2025-11-03T21:33:01.393Z" },
+    { url = "https://files.pythonhosted.org/packages/84/b1/03188f634a409353a84b5ef49754b97dbcc0c0f6fd6c8ede505a8960a0a4/regex-2025.11.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:66d559b21d3640203ab9075797a55165d79017520685fb407b9234d72ab63c62", size = 787958, upload-time = "2025-11-03T21:33:03.379Z" },
+    { url = "https://files.pythonhosted.org/packages/99/6a/27d072f7fbf6fadd59c64d210305e1ff865cc3b78b526fd147db768c553b/regex-2025.11.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:669dcfb2e38f9e8c69507bace46f4889e3abbfd9b0c29719202883c0a603598f", size = 859289, upload-time = "2025-11-03T21:33:05.374Z" },
+    { url = "https://files.pythonhosted.org/packages/9a/70/1b3878f648e0b6abe023172dacb02157e685564853cc363d9961bcccde4e/regex-2025.11.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:32f74f35ff0f25a5021373ac61442edcb150731fbaa28286bbc8bb1582c89d02", size = 850026, upload-time = "2025-11-03T21:33:07.131Z" },
+    { url = "https://files.pythonhosted.org/packages/dd/d5/68e25559b526b8baab8e66839304ede68ff6727237a47727d240006bd0ff/regex-2025.11.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:e6c7a21dffba883234baefe91bc3388e629779582038f75d2a5be918e250f0ed", size = 789499, upload-time = "2025-11-03T21:33:09.141Z" },
+    { url = "https://files.pythonhosted.org/packages/fc/df/43971264857140a350910d4e33df725e8c94dd9dee8d2e4729fa0d63d49e/regex-2025.11.3-cp314-cp314-win32.whl", hash = "sha256:795ea137b1d809eb6836b43748b12634291c0ed55ad50a7d72d21edf1cd565c4", size = 271604, upload-time = "2025-11-03T21:33:10.9Z" },
+    { url = "https://files.pythonhosted.org/packages/01/6f/9711b57dc6894a55faf80a4c1b5aa4f8649805cb9c7aef46f7d27e2b9206/regex-2025.11.3-cp314-cp314-win_amd64.whl", hash = "sha256:9f95fbaa0ee1610ec0fc6b26668e9917a582ba80c52cc6d9ada15e30aa9ab9ad", size = 280320, upload-time = "2025-11-03T21:33:12.572Z" },
+    { url = "https://files.pythonhosted.org/packages/f1/7e/f6eaa207d4377481f5e1775cdeb5a443b5a59b392d0065f3417d31d80f87/regex-2025.11.3-cp314-cp314-win_arm64.whl", hash = "sha256:dfec44d532be4c07088c3de2876130ff0fbeeacaa89a137decbbb5f665855a0f", size = 273372, upload-time = "2025-11-03T21:33:14.219Z" },
+    { url = "https://files.pythonhosted.org/packages/c3/06/49b198550ee0f5e4184271cee87ba4dfd9692c91ec55289e6282f0f86ccf/regex-2025.11.3-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:ba0d8a5d7f04f73ee7d01d974d47c5834f8a1b0224390e4fe7c12a3a92a78ecc", size = 491985, upload-time = "2025-11-03T21:33:16.555Z" },
+    { url = "https://files.pythonhosted.org/packages/ce/bf/abdafade008f0b1c9da10d934034cb670432d6cf6cbe38bbb53a1cfd6cf8/regex-2025.11.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:442d86cf1cfe4faabf97db7d901ef58347efd004934da045c745e7b5bd57ac49", size = 292669, upload-time = "2025-11-03T21:33:18.32Z" },
+    { url = "https://files.pythonhosted.org/packages/f9/ef/0c357bb8edbd2ad8e273fcb9e1761bc37b8acbc6e1be050bebd6475f19c1/regex-2025.11.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:fd0a5e563c756de210bb964789b5abe4f114dacae9104a47e1a649b910361536", size = 291030, upload-time = "2025-11-03T21:33:20.048Z" },
+    { url = "https://files.pythonhosted.org/packages/79/06/edbb67257596649b8fb088d6aeacbcb248ac195714b18a65e018bf4c0b50/regex-2025.11.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:bf3490bcbb985a1ae97b2ce9ad1c0f06a852d5b19dde9b07bdf25bf224248c95", size = 807674, upload-time = "2025-11-03T21:33:21.797Z" },
+    { url = "https://files.pythonhosted.org/packages/f4/d9/ad4deccfce0ea336296bd087f1a191543bb99ee1c53093dcd4c64d951d00/regex-2025.11.3-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3809988f0a8b8c9dcc0f92478d6501fac7200b9ec56aecf0ec21f4a2ec4b6009", size = 873451, upload-time = "2025-11-03T21:33:23.741Z" },
+    { url = "https://files.pythonhosted.org/packages/13/75/a55a4724c56ef13e3e04acaab29df26582f6978c000ac9cd6810ad1f341f/regex-2025.11.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f4ff94e58e84aedb9c9fce66d4ef9f27a190285b451420f297c9a09f2b9abee9", size = 914980, upload-time = "2025-11-03T21:33:25.999Z" },
+    { url = "https://files.pythonhosted.org/packages/67/1e/a1657ee15bd9116f70d4a530c736983eed997b361e20ecd8f5ca3759d5c5/regex-2025.11.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7eb542fd347ce61e1321b0a6b945d5701528dca0cd9759c2e3bb8bd57e47964d", size = 812852, upload-time = "2025-11-03T21:33:27.852Z" },
+    { url = "https://files.pythonhosted.org/packages/b8/6f/f7516dde5506a588a561d296b2d0044839de06035bb486b326065b4c101e/regex-2025.11.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:d6c2d5919075a1f2e413c00b056ea0c2f065b3f5fe83c3d07d325ab92dce51d6", size = 795566, upload-time = "2025-11-03T21:33:32.364Z" },
+    { url = "https://files.pythonhosted.org/packages/d9/dd/3d10b9e170cc16fb34cb2cef91513cf3df65f440b3366030631b2984a264/regex-2025.11.3-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:3f8bf11a4827cc7ce5a53d4ef6cddd5ad25595d3c1435ef08f76825851343154", size = 868463, upload-time = "2025-11-03T21:33:34.459Z" },
+    { url = "https://files.pythonhosted.org/packages/f5/8e/935e6beff1695aa9085ff83195daccd72acc82c81793df480f34569330de/regex-2025.11.3-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:22c12d837298651e5550ac1d964e4ff57c3f56965fc1812c90c9fb2028eaf267", size = 854694, upload-time = "2025-11-03T21:33:36.793Z" },
+    { url = "https://files.pythonhosted.org/packages/92/12/10650181a040978b2f5720a6a74d44f841371a3d984c2083fc1752e4acf6/regex-2025.11.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:62ba394a3dda9ad41c7c780f60f6e4a70988741415ae96f6d1bf6c239cf01379", size = 799691, upload-time = "2025-11-03T21:33:39.079Z" },
+    { url = "https://files.pythonhosted.org/packages/67/90/8f37138181c9a7690e7e4cb388debbd389342db3c7381d636d2875940752/regex-2025.11.3-cp314-cp314t-win32.whl", hash = "sha256:4bf146dca15cdd53224a1bf46d628bd7590e4a07fbb69e720d561aea43a32b38", size = 274583, upload-time = "2025-11-03T21:33:41.302Z" },
+    { url = "https://files.pythonhosted.org/packages/8f/cd/867f5ec442d56beb56f5f854f40abcfc75e11d10b11fdb1869dd39c63aaf/regex-2025.11.3-cp314-cp314t-win_amd64.whl", hash = "sha256:adad1a1bcf1c9e76346e091d22d23ac54ef28e1365117d99521631078dfec9de", size = 284286, upload-time = "2025-11-03T21:33:43.324Z" },
+    { url = "https://files.pythonhosted.org/packages/20/31/32c0c4610cbc070362bf1d2e4ea86d1ea29014d400a6d6c2486fcfd57766/regex-2025.11.3-cp314-cp314t-win_arm64.whl", hash = "sha256:c54f768482cef41e219720013cd05933b6f971d9562544d691c68699bf2b6801", size = 274741, upload-time = "2025-11-03T21:33:45.557Z" },
+]
+
 [[package]]
 name = "requests"
 version = "2.32.5"
@@ -571,6 +712,20 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" },
 ]
 
+[[package]]
+name = "seaborn"
+version = "0.13.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "matplotlib" },
+    { name = "numpy" },
+    { name = "pandas" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/86/59/a451d7420a77ab0b98f7affa3a1d78a313d2f7281a57afb1a34bae8ab412/seaborn-0.13.2.tar.gz", hash = "sha256:93e60a40988f4d65e9f4885df477e2fdaff6b73a9ded434c1ab356dd57eefff7", size = 1457696, upload-time = "2024-01-25T13:21:52.551Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/83/11/00d3c3dfc25ad54e731d91449895a79e4bf2384dc3ac01809010ba88f6d5/seaborn-0.13.2-py3-none-any.whl", hash = "sha256:636f8336facf092165e27924f223d3c62ca560b1f2bb5dff7ab7fad265361987", size = 294914, upload-time = "2024-01-25T13:21:49.598Z" },
+]
+
 [[package]]
 name = "six"
 version = "1.17.0"
@@ -589,6 +744,18 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/14/a0/bb38d3b76b8cae341dad93a2dd83ab7462e6dbcdd84d43f54ee60a8dc167/soupsieve-2.8-py3-none-any.whl", hash = "sha256:0cc76456a30e20f5d7f2e14a98a4ae2ee4e5abdc7c5ea0aafe795f344bc7984c", size = 36679, upload-time = "2025-08-27T15:39:50.179Z" },
 ]
 
+[[package]]
+name = "tqdm"
+version = "4.67.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "colorama", marker = "sys_platform == 'win32'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/a8/4b/29b4ef32e036bb34e4ab51796dd745cdba7ed47ad142a9f4a1eb8e0c744d/tqdm-4.67.1.tar.gz", hash = "sha256:f8aef9c52c08c13a65f30ea34f4e5aac3fd1a34959879d7e59e63027286627f2", size = 169737, upload-time = "2024-11-24T20:12:22.481Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl", hash = "sha256:26445eca388f82e72884e0d580d5464cd801a3ea01e63e5601bdff9ba6a48de2", size = 78540, upload-time = "2024-11-24T20:12:19.698Z" },
+]
+
 [[package]]
 name = "typing-extensions"
 version = "4.15.0"
