- Music Theory (Lesson, Quiz, Interactive Experience Features)
- Basic Tools (Tuner, Metronome, Keyboard)
- Music Score documentation
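A tuner like the one listed above typically maps a detected frequency onto the nearest equal-temperament pitch. A minimal sketch of that math (function names are illustrative, not taken from this project):

```swift
import Foundation

/// Equal-temperament frequency for a MIDI note number,
/// relative to A4 (MIDI 69) tuned to 440 Hz by default.
func noteFrequency(midi: Int, a4: Double = 440.0) -> Double {
    a4 * pow(2.0, Double(midi - 69) / 12.0)
}

/// Cents deviation of a measured frequency from a target pitch;
/// a tuner needle would display this value (0 = perfectly in tune).
func centsOff(measured: Double, target: Double) -> Double {
    1200.0 * log2(measured / target)
}
```

For example, `noteFrequency(midi: 81)` yields 880 Hz (A5, one octave above A4).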
- iOS 18.0+
- Xcode 14.0+
- Swift 5.0+
- Clone the repo: `git clone https://github.com/username/project.git`
- Open the `.xcodeproj` file in Xcode
- Build and run the project
- Swift 5.0+: Core programming language
- SwiftUI: Modern declarative UI framework
- UIKit: Traditional UI framework for custom components
- Observation: For reactive programming and data flow
- AVFoundation: For audio input, output and manipulation
- Swift Package Manager: Dependency management
- Supabase
  - Real-time database
  - Authentication
  - Storage
- FastAPI
  - Optical Music Recognition (oemer library)

Note: The FastAPI backend is currently in an experimental phase. Run the repo on localhost to use the optical music recognition feature: https://github.com/wyattcheang/notecraft_fastapi
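Calling the locally hosted OMR backend from the app would look roughly like the sketch below. The host, port, and `/omr` path are assumptions for illustration; check the notecraft_fastapi repo for the actual routes and payload format.

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking
#endif

/// Builds a POST request that uploads a score image to a locally hosted
/// OMR endpoint. The endpoint path is a placeholder, not the real route.
func omrRequest(imageData: Data, baseURL: URL) -> URLRequest {
    var request = URLRequest(url: baseURL.appendingPathComponent("omr"))
    request.httpMethod = "POST"
    request.setValue("application/octet-stream", forHTTPHeaderField: "Content-Type")
    request.httpBody = imageData
    return request
}
```

The request would then be sent with `URLSession.shared.dataTask(with:)` and the recognized score parsed from the response.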
UI Layer (SwiftUI/UIKit) ↔️ View Models ↔️ Services ↔️ Supabase/FastAPI ↔️ Database
- MVVM: Main architecture pattern
- Repository Pattern: For data access
- Dependency Injection: For better testability
- Observer Pattern: Using Combine for reactive updates
- Factory Pattern: For creating complex objects
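The MVVM and dependency-injection patterns above can be sketched together: a view model receives its service through the initializer, so a mock can be swapped in for tests. Type and method names here are illustrative, not the project's actual API; in the app the view model would be an `@Observable` class driving a SwiftUI view.

```swift
import Foundation

// Service abstraction: the view model depends on a protocol,
// not a concrete Supabase-backed implementation.
protocol QuizService {
    func fetchQuestions() -> [String]
}

// A mock implementation, as would be injected in unit tests.
struct MockQuizService: QuizService {
    func fetchQuestions() -> [String] { ["Name the interval C–G"] }
}

// MVVM view model with constructor injection.
final class QuizViewModel {
    private let service: QuizService
    private(set) var questions: [String] = []

    init(service: QuizService) {
        self.service = service
    }

    func load() {
        questions = service.fetchQuestions()
    }
}
```

Because the dependency is a protocol, production code injects the real Supabase-backed service while tests inject the mock, which is the main testability benefit claimed above.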
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
- Your Name - Initial work - @wyattcheang
- Create an issue
- Email: wyattcheangwaihoe@icloud.com