commit 30b2545f4c

CHANGELOG.md

## 3.1.0+1

### Fixes

- Fixed an error when building the macOS library

## 3.1.0

### Breaking

Sorry for this breaking change. Unfortunately, it was necessary to fix stability issues on Android.

- `directory` is now required for `Isar.open()` and `Isar.openSync()`
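
Opening an instance after this change might look like the following sketch. This is illustrative only: `UserSchema` stands in for any generated collection schema, and `path_provider` is just one common way to obtain a writable directory on Flutter.

```dart
import 'package:isar/isar.dart';
import 'package:path_provider/path_provider.dart';

// Hypothetical generated schema; produced by the Isar generator
// for a @collection class named `User`.
Future<Isar> openDb() async {
  final dir = await getApplicationDocumentsDirectory();
  return Isar.open(
    [UserSchema],
    // `directory` no longer has a default and must be passed explicitly.
    directory: dir.path,
  );
}
```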

### Fixes

- Fixed a crash that occasionally occurred when opening Isar
- Fixed a schema migration issue
- Fixed an issue where embedded class renaming didn't work correctly

### Enhancements

- Many internal improvements
- Performance improvements

## 3.0.6

### Fixes

- Added a check to verify transactions are used for the correct instance
- Added a check to verify that async transactions are still active
- Fixed an upstream issue with opening databases

## 3.0.5

### Enhancements

- Improved performance for all operations
- Added `maxSizeMiB` option to `Isar.open()` to specify the maximum size of the database file
- Significantly reduced native library size
- With the help of the community, the docs have been translated into a range of languages
- Improved API docs
- Added integration tests for more platforms to ensure high-quality releases
- Support for Unicode paths on Windows
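
A minimal sketch of the new `maxSizeMiB` option, assuming a generated `UserSchema` and a placeholder directory (both hypothetical, not part of this release):

```dart
import 'package:isar/isar.dart';

Future<Isar> openWithLimit(String dir) {
  return Isar.open(
    [UserSchema], // placeholder generated collection schema
    directory: dir,
    // Cap the database file; writes beyond this limit fail instead of
    // growing the file unboundedly.
    maxSizeMiB: 512,
  );
}
```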

### Fixes

- Fixed a crash while opening Isar
- Fixed a crash on older Android devices
- Fixed a native port that was not closed correctly in some cases
- Added the Swift version to the podspec
- Fixed a crash on Windows
- Fixed the "IndexNotFound" error

## 3.0.4

REDACTED.

## 3.0.3

REDACTED.

## 3.0.2

### Enhancements

- The Inspector now supports creating objects and importing JSON
- Added Inspector check to make sure Chrome is used

### Fixes

- Added support for the latest analyzer
- Fixed native ports that were not closed correctly in some cases
- Added support for Ubuntu 18.04 and older
- Fixed issue with aborting transactions
- Fixed crash when invalid JSON was provided to `importJsonRaw()`
- Added missing `exportJsonSync()` and `exportJsonRawSync()`
- Fixed issue where secondary instance could not be selected in the Inspector

## 3.0.1

### Enhancements

- Support for arm64 iOS Simulators

### Fixes

- Fixed issue where `.anyOf()`, `.allOf()`, and `.oneOf()` could not be negated
- Fixed too-low minimum iOS version; the minimum supported version is 11.0
- Fixed error during macOS App Store build

## 3.0.0

This release has been a lot of work! Thanks to everyone who contributed and joined the countless discussions. You are really awesome!

Special thanks to [@Jtplouffe](https://github.com/Jtplouffe) and [@Peyman](https://github.com/Viper-Bit) for their incredible work.

### Web support

This version does not support the web target yet. It will be back in the next version. Please continue using 2.5.0 if you need web support.

### Enhancements

- Completely new Isar Inspector that no longer needs to be installed
- Extreme performance improvements for almost all operations (up to 50%)
- Support for embedded objects using `@embedded`
- Support for enums using `@enumerated`
- Vastly improved space efficiency of the Isar binary format, resulting in about 20% smaller databases
- Added `Id`, `byte`, `short` and `float` typedefs
- `IsarLinks` now supports all `Set` methods based on the Isar `Id` of objects
- Added `download` option to `Isar.initializeIsarCore()` to download binaries automatically
- Added `replace` option for indexes
- Added verification of the correct Isar binary version
- Added `collection.getSize()` and `collection.getSizeSync()`
- Added `query.anyOf()` and `query.allOf()` query modifiers
- Support for much more complex composite index queries
- Support for logical XOR and the `.oneOf()` query modifier
- Made providing a path optional
- The default Isar name is now `default` and stored in `dir/name.isar` and `dir/name.isar.lock`
- On non-web platforms, `IsarLink` and `IsarLinks` load automatically
- `.putSync()`, `.putAllSync()` etc. now save links recursively by default
- Added `isar.getSize()` and `isar.getSizeSync()`
- Added `linksLengthEqualTo()`, `linksIsEmpty()`, `linksIsNotEmpty()`, `linksLengthGreaterThan()`, `linksLengthLessThan()`, `linksLengthBetween()` and `linkIsNull()` filters
- Added `listLengthEqualTo()`, `listIsEmpty()`, `listIsNotEmpty()`, `listLengthGreaterThan()`, `listLengthLessThan()` and `listLengthBetween()` filters
- Added `isNotNull()` filters
- Added `compactOnLaunch` conditions to `Isar.open()` for automatic database compaction
- Added `isar.copyToFile()`, which copies a compacted version of the database to a path
- Added a check to verify that schemas of linked collections are provided when opening an instance
- Default values from the constructor are now applied during deserialization
- Added `isar.verify()` and `col.verify()` methods for checking database integrity in unit tests
- Added missing float and double queries and an `epsilon` parameter
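
The new `@embedded` and `@enumerated` annotations can be combined in one model. A minimal sketch (class and field names are illustrative; `contact.g.dart` is the file the generator would emit):

```dart
import 'package:isar/isar.dart';

part 'contact.g.dart'; // generated by the Isar generator

enum Status { active, archived }

@embedded
class Address {
  String? street;
  String? city;
}

@collection
class Contact {
  Id id = Isar.autoIncrement;

  late String name;

  // Enum values are persisted without a TypeConverter.
  @enumerated
  late Status status;

  // Embedded objects are serialized inline with the contact,
  // not stored in their own collection.
  Address? address;
}
```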

### Breaking changes

- Removed `TypeConverter` support in favor of `@embedded` and `@enumerated`
- Removed the `@Id()` and `@Size32()` annotations in favor of the `Id` and `short` types
- Changed the `schemas` parameter from named to positional
- The maximum size of objects is now 16MB
- Removed the `replaceOnConflict` and `saveLinks` parameters from `collection.put()` and `collection.putAll()`
- Removed the `isar` parameter from `Isar.txn()`, `Isar.writeTxn()`, `Isar.txnSync()` and `Isar.writeTxnSync()`
- Removed `query.repeat()`
- Removed `query.sortById()` and `query.distinctById()`
- Fixed `.or()` instead of `.and()` being used implicitly when combining filters
- Renamed multi-entry where clauses from `.yourListAnyEqualTo()` to `.yourListElementEqualTo()` to avoid confusion
- Isar no longer creates the provided directory. Make sure it exists before opening an Isar instance.
- Changed the default index type for all `List`s to `IndexType.hash`
- Renamed `isar.getCollection()` to `isar.collection()`
- It is no longer allowed to extend or implement another collection
- Unsupported properties are no longer ignored by default
- Renamed the `initialReturn` parameter to `fireImmediately`
- Renamed `Isar.initializeLibraries()` to `Isar.initializeIsarCore()`

### Fixes

There are too many fixes to list them all.

- A lot of link fixes and a slight behavior change to make them super reliable
- Fixed missing symbols on older Android phones
- Fixed composite queries
- Fixed various generator issues
- Fixed an error retrieving the id property in a query
- Fixed missing symbols on 32-bit Android 5 & 6 devices
- Fixed inconsistent `null` handling in JSON export
- Fixed a default directory issue on Android
- Fixed different where clauses returning duplicate results
- Fixed a hash index issue where multiple list values resulted in the same hash
- Fixed an edge case where creating a new index failed

## 2.5.0

### Enhancements

- Support for Android x86 (32-bit emulator) and macOS arm64 (Apple Silicon)
- Greatly improved test coverage for sync methods
- `col.clear()` now resets the auto-increment counter to `0`
- Significantly reduced Isar Core binary size (about 1.4MB -> 800KB)

### Minor Breaking

- Changed `initializeLibraries(Map<String, String> libraries)` to `initializeLibraries(Map<IsarAbi, String> libraries)`
- Changed the minimum Dart SDK to `2.16.0`

### Fixes

- Fixed an issue with `IsarLink.saveSync()`
- Fixed `id` queries
- Fixed an error thrown by `BroadcastChannel` in Firefox
- Fixed an Isar Inspector connection issue

## 2.4.0

### Enhancements

- Support for querying links
- Support for filtering and sorting links
- Added methods to update and count links without loading them
- Added `isLoaded` property to links
- Added methods to count the number of objects in a collection
- Big internal improvements

### Minor Breaking

- There are now different kinds of where clauses for dynamic queries
- `isar.getCollection()` no longer requires the name of the collection
- `Isar.instanceNames` now returns a `Set` instead of a `List`

### Fixes

- Fixed an iOS crash that frequently happened on older devices
- Fixed a 32-bit issue on Android
- Fixed link issues
- Fixed missing `BroadcastChannel` API for older Safari versions

## 2.2.1

### Enhancements

- Reduced Isar web code size by 50%
- Made `directory` parameter of `Isar.open()` optional for web
- Made `name` parameter of `Isar.getInstance()` optional
- Added `Isar.defaultName` constant
- Enabled `TypeConverter`s with supertypes
- Added message if `TypeConverter` nullability doesn't match
- Added more tests

### Fixes

- Fixed an issue with date queries
- Fixed the `FilterGroup.not` constructor (thanks for the PR, @jtzell)

## 2.2.0

Isar now has full web support 🎉. No changes to your code are required; just run it.

_Web passes all unit tests but is still considered beta for now._

### Minor Breaking

- Added `saveLinks` parameter to `.put()` and `.putAll()`, which defaults to `false`
- Changed the default `overrideChanges` parameter of `links.load()` to `true` to avoid unintended behavior
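
Since links are no longer saved implicitly, callers who relied on the old behavior now opt in explicitly. A rough sketch against the 2.x API (the `contacts` accessor and `contact` object are hypothetical placeholders):

```dart
// Inside an async write transaction (2.x style, where the callback
// still receives the isar instance):
await isar.writeTxn((isar) async {
  // saveLinks now defaults to false, so pass it explicitly
  // when the object's links should be persisted too.
  await isar.contacts.put(contact, saveLinks: true);
});
```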

### Enhancements

- Full web support!
- Improved write performance
- Added `deleteFromDisk` option to `isar.close()`
- Added `.reset()` and `.resetSync()` methods to `IsarLink` and `IsarLinks`
- Improved `links.save()` performance
- Added many tests

### Fixed

- Fixed the value of `null` dates to be `DateTime.fromMillisecondsSinceEpoch(0)`
- Fixed a problem with migration
- Fixed incorrect list values for new properties (`[]` instead of `null`)
- Improved handling of link edge cases

## 2.1.4

- Removed `path` dependency
- Fixed incorrect return value of `deleteByIndex()`
- Fixed wrong auto-increment ids in some cases (thanks @robban112)
- Fixed an issue with `Isar.close()` (thanks @msxenon)
- Fixed `$` escaping in generated code (thanks @jtzell)
- Fixed broken link on the pub.dev example page

## 2.1.0

`isar_connect` is now integrated into `isar`

### Enhancements

- Added check for outdated generated files
- Added check for changed schema across isolates
- Added `Isar.openSync()`
- Added `col.importJsonRawSync()`, `col.importJsonSync()`, `query.exportJsonRawSync()` and `query.exportJsonSync()`
- Improved performance for queries
- Improved handling of FFI memory
- More tests

### Fixed

- Fixed an issue where imported JSON required existing ids
- Fixed an issue with transaction handling (thanks @Peng-Qian for the awesome help)
- Fixed an issue with the `@Ignore` annotation not always working
- Fixed an issue with `getByIndex()` not returning the correct object id (thanks @jtzell)

## 2.0.0

### Breaking

- The id for non-final objects is now assigned automatically after `.put()` and `.putSync()`
- `double` and `List<double>` indexes can no longer be at the beginning of a composite index
- `List<double>` indexes can no longer be hashed
- `.greaterThan()`, `.lessThan()` and `.between()` filters are now exclusive for `double` values (`>=` -> `>`)
- Changed the default index type for lists to `IndexType.value`
- `IsarLink` and `IsarLinks` will no longer be initialized by Isar and must not be `nullable` or `late`
- Dart `2.14` or higher is required

### Enhancements

- Added API docs for all public methods
- Added `isar.clear()`, `isar.clearSync()`, `col.clear()` and `col.clearSync()`
- Added `col.filter()` as a shortcut for `col.where().filter()`
- Added `include` parameter to `.greaterThan()` and `.lessThan()` filters and where clauses
- Added `includeLower` and `includeUpper` parameters to `.between()` filters and where clauses
- Added `Isar.autoIncrement` to allow non-nullable auto-incrementing ids
- `Isar.close()` now returns whether the last instance was closed
- List values in composite indexes are now of type `IndexType.hash` automatically
- Allowed multiple indexes on the same property
- Removed exported packages from API docs
- Improved generated code
- Improved Isar Core error messages
- Minor performance improvements
- Automatic Xcode configuration
- Updated analyzer to `3.0.0`
- More tests
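
`Isar.autoIncrement` lets an id field be non-nullable while still being assigned by the database. A sketch in the 2.x style (`@Collection()` annotation, plain `int` ids; the `Message` class is illustrative):

```dart
import 'package:isar/isar.dart';

@Collection()
class Message {
  // Sentinel value: Isar replaces it with the next
  // auto-increment id during `.put()`.
  int id = Isar.autoIncrement;

  late String content;
}
```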

### Fixed

- `IsarLink` and `IsarLinks` can now be final
- Fixed multi-entry index queries returning items multiple times in some cases
- Fixed `.anyLessThan()` and `.anyGreaterThan()` issues
- Fixed issues with backlinks
- Fixed an issue where a query only returned the first `99999` results
- Fixed an issue with id where clauses
- Fixed the default index type for lists and bytes
- Fixed an issue where renaming indexes was not possible
- Fixed an issue where the wrong index name was used for `.getByX()` and `.deleteByX()`
- Fixed an issue where composite indexes did not allow non-hashed Strings as the last value
- Fixed an issue where `@Ignore()` fields were not ignored

## 1.0.5

### Enhancements

- Updated dependencies

### Fixes

- Included desktop binaries
- Fixed "Cannot allocate memory" error on older iOS devices
- Fixed stripped binaries for iOS release builds
- Fixed Isar Inspector issues (thanks to [RubenBez](https://github.com/RubenBez) and [rizzi37](https://github.com/rizzi37))

## 1.0.0+1

Added missing binaries

## 1.0.0

Switched from liblmdb to libmdbx for better performance, more stability and many internal improvements.

### Breaking

The internal database format has been changed to improve performance. Old databases do not work anymore!

### Fixes

- Fixed an issue with links being removed after an object update
- Fixed String index problems

### Enhancements

- Support for `greaterThan`, `lessThan` and `between` queries for String values
- Support for inheritance (enabled by default)
- Support for `final` properties and getters
- Support for `freezed` and other code generators
- Support for getting / deleting objects by a key (e.g. `col.deleteByName('Anne')`)
- Support for list indexes (hash- or element-based)
- The generator now creates individual files instead of one big file
- Allow specifying the collection accessor name
- Unsupported properties are now ignored automatically
- `.put()` operations now return the assigned ids (objects are no longer mutated)
- Introduced a `replaceOnConflict` option for `.put()` (instead of specifying it for the index)
- many more...

### Internal

- Improved generated code
- Many new unit tests

## 0.4.0

### Breaking

- Removed the `.where...In()` and `...In()` extension methods
- Split `.watch(lazy: bool)` into `.watch()` and `.watchLazy()`
- Removed the `include` option for filters

### Fixes

- Generate an id for JSON imports that don't have one
- Enabled `sortBy` and `thenBy` generation

### Enhancements

- Added `.optional()` and `.repeat()` query modifiers
- Support for property queries
- Support for query aggregation
- Support for dynamic queries (for custom query languages)
- Support for multi-package configuration with `@ExternalCollection()`
- Added `caseSensitive` option to `.distinctBy()`

### Internal

- Changed iOS linking
- Improved generated code
- Set up integration tests and improved unit tests
- Use CORE/0.4.0

## 0.2.0

- Link support
- Many improvements and fixes

## 0.1.0

- Support for links and backlinks

## 0.0.4

- Bugfixes and many improvements

## 0.0.2

Fixed a dependency issue

## 0.0.1

Initial release

LICENSE

                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.

      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "[]"
      replaced with your own identifying information. (Don't include
      the brackets!) The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright 2022 Simon Leier

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.

267
README.md
Normal file
@ -0,0 +1,267 @@
<p align="center">
  <a href="https://isar.dev">
    <img src="https://raw.githubusercontent.com/isar/isar/main/.github/assets/isar.svg?sanitize=true" height="128">
  </a>
  <h1 align="center">Isar Database</h1>
</p>

<p align="center">
  <a href="https://pub.dev/packages/isar">
    <img src="https://img.shields.io/pub/v/isar?label=pub.dev&labelColor=333940&logo=dart">
  </a>
  <a href="https://github.com/isar/isar/actions/workflows/test.yaml">
    <img src="https://img.shields.io/github/actions/workflow/status/isar/isar/test.yaml?branch=main&label=tests&labelColor=333940&logo=github">
  </a>
  <a href="https://app.codecov.io/gh/isar/isar">
    <img src="https://img.shields.io/codecov/c/github/isar/isar?logo=codecov&logoColor=fff&labelColor=333940">
  </a>
  <a href="https://t.me/isardb">
    <img src="https://img.shields.io/static/v1?label=join&message=isardb&labelColor=333940&logo=telegram&logoColor=white&color=229ED9">
  </a>
  <a href="https://twitter.com/simonleier">
    <img src="https://img.shields.io/twitter/follow/simcdev?style=social">
  </a>
</p>

<p align="center">
  <a href="https://isar.dev">Quickstart</a> •
  <a href="https://isar.dev/schema">Documentation</a> •
  <a href="https://github.com/isar/isar/tree/main/examples/">Sample Apps</a> •
  <a href="https://github.com/isar/isar/discussions">Support & Ideas</a> •
  <a href="https://pub.dev/packages/isar">Pub.dev</a>
</p>

> #### Isar [ee-zahr]:
>
> 1. River in Bavaria, Germany.
> 2. [Crazy fast](#benchmarks) NoSQL database that is a joy to use.
## Features

- 💙 **Made for Flutter**. Easy to use, no config, no boilerplate
- 🚀 **Highly scalable**. The sky is the limit (pun intended)
- 🍭 **Feature rich**. Composite & multi-entry indexes, query modifiers, JSON support etc.
- ⏱ **Asynchronous**. Parallel query operations & multi-isolate support by default
- 🦄 **Open source**. Everything is open source and free forever!

Isar database can do much more (and we are just getting started):

- 🕵️ **Full-text search**. Make searching fast and fun
- 📱 **Multiplatform**. iOS, Android, Desktop
- 🧪 **ACID semantics**. Rely on database consistency
- 💃 **Static typing**. Compile-time checked and autocompleted queries
- ✨ **Beautiful documentation**. Readable, easy to understand and ever-improving

Join the [Telegram group](https://t.me/isardb) for discussion and sneak peeks of new versions of the DB.

If you want to say thank you, star us on GitHub and like us on pub.dev 🙌💙
## Quickstart

Holy smokes you're here! Let's get started on using the coolest Flutter database out there...

### 1. Add to pubspec.yaml

```yaml
isar_version: &isar_version 3.1.0 # define the version to be used

dependencies:
  isar: *isar_version
  isar_flutter_libs: *isar_version # contains Isar Core

dev_dependencies:
  isar_generator: *isar_version
  build_runner: any
```

### 2. Annotate a Collection

```dart
part 'email.g.dart';

@collection
class Email {
  Id id = Isar.autoIncrement; // you can also use id = null to auto increment

  @Index(type: IndexType.value)
  String? title;

  List<Recipient>? recipients;

  @enumerated
  Status status = Status.pending;
}

@embedded
class Recipient {
  String? name;

  String? address;
}

enum Status {
  draft,
  pending,
  sent,
}
```
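The `part 'email.g.dart';` directive above refers to generated code. A typical way to produce it in a Flutter project with the dev dependencies shown in step 1 (the exact invocation may differ per setup) is:

```shell
# Runs isar_generator via build_runner and writes email.g.dart
flutter pub run build_runner build
```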
### 3. Open a database instance

```dart
final dir = await getApplicationDocumentsDirectory();
final isar = await Isar.open(
  [EmailSchema],
  directory: dir.path,
);
```
### 4. Query the database

```dart
final emails = await isar.emails.filter()
  .titleContains('awesome', caseSensitive: false)
  .sortByStatusDesc()
  .limit(10)
  .findAll();
```
## Isar Database Inspector

The Isar Inspector allows you to inspect the Isar instances & collections of your app in real-time. You can execute queries, edit properties, switch between instances and sort the data.

<img src="https://raw.githubusercontent.com/isar/isar/main/.github/assets/inspector.gif">

To launch the inspector, just run your Isar app in debug mode and open the Inspector link in the logs.
## CRUD operations

All basic CRUD operations are available via the `IsarCollection`.

```dart
final newEmail = Email()..title = 'Amazing new database';

await isar.writeTxn(() async {
  await isar.emails.put(newEmail); // insert & update
});

final existingEmail = await isar.emails.get(newEmail.id); // get

await isar.writeTxn(() async {
  await isar.emails.delete(existingEmail!.id); // delete
});
```
## Database Queries

Isar database has a powerful query language that allows you to make use of your indexes, filter distinct objects, use complex `and()`, `or()` and `xor()` groups, query links and sort the results.

```dart
final importantEmails = await isar.emails
  .where()
  .titleStartsWith('Important') // use index
  .limit(10)
  .findAll();

final specificEmails = await isar.emails
  .filter()
  .recipient((q) => q.nameEqualTo('David')) // query embedded objects
  .or()
  .titleMatches('*university*', caseSensitive: false) // title containing 'university' (case insensitive)
  .findAll();
```
## Database Watchers

With Isar database, you can watch collections, objects, or queries. A watcher is notified after a transaction commits successfully and the target actually changes. Watchers can be lazy and not reload the data, or they can be non-lazy and fetch new results in the background.

```dart
Stream<void> collectionStream = isar.emails.watchLazy();

Stream<List<Email>> queryStream = importantEmails.watch();

queryStream.listen((newResult) {
  // do UI updates
});
```
## Benchmarks

Benchmarks only give a rough idea of the performance of a database but as you can see, Isar NoSQL database is quite fast 😇

| <img src="https://raw.githubusercontent.com/isar/isar/main/.github/assets/benchmarks/insert.png" width="100%" /> | <img src="https://raw.githubusercontent.com/isar/isar/main/.github/assets/benchmarks/query.png" width="100%" /> |
| ---------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------- |
| <img src="https://raw.githubusercontent.com/isar/isar/main/.github/assets/benchmarks/delete.png" width="100%" /> | <img src="https://raw.githubusercontent.com/isar/isar/main/.github/assets/benchmarks/size.png" width="100%" />   |

If you are interested in more benchmarks or want to check how Isar performs on your device you can run the [benchmarks](https://github.com/isar/isar_benchmark) yourself.
## Unit tests

If you want to use Isar database in unit tests or Dart code, call `await Isar.initializeIsarCore(download: true)` before using Isar in your tests.

Isar NoSQL database will automatically download the correct binary for your platform. You can also pass a `libraries` map to adjust the download location for each platform.

Make sure to use `flutter test -j 1` to avoid tests running in parallel; parallel test runs would break the automatic download.
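A minimal test following the steps above might look like this sketch. It reuses the `Email`/`EmailSchema` collection from the Quickstart; the `package:test` dependency and the temporary-directory handling are illustrative assumptions, not part of this README:

```dart
import 'dart:io';

import 'package:isar/isar.dart';
import 'package:test/test.dart';

void main() {
  test('put and get an email', () async {
    // Downloads the Isar Core binary for the current platform.
    await Isar.initializeIsarCore(download: true);

    // Throwaway directory so each test run starts with an empty database.
    final dir = Directory.systemTemp.createTempSync();
    final isar = await Isar.open([EmailSchema], directory: dir.path);

    final email = Email()..title = 'Hello';
    await isar.writeTxn(() async {
      await isar.emails.put(email);
    });

    final fetched = await isar.emails.get(email.id);
    expect(fetched?.title, 'Hello');

    await isar.close(deleteFromDisk: true);
  });
}
```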
## Contributors ✨

Big thanks go to these wonderful people:

<!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section -->
<!-- prettier-ignore-start -->
<!-- markdownlint-disable -->
<table>
  <tbody>
    <tr>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/AlexisL61"><img src="https://avatars.githubusercontent.com/u/30233189?v=4" width="100px;" alt=""/><br /><sub><b>Alexis</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/buraktabn"><img src="https://avatars.githubusercontent.com/u/49204989?v=4" width="100px;" alt=""/><br /><sub><b>Burak</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/CarloDotLog"><img src="https://avatars.githubusercontent.com/u/13763473?v=4" width="100px;" alt=""/><br /><sub><b>Carlo Loguercio</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/Frostedfox"><img src="https://avatars.githubusercontent.com/u/84601232?v=4" width="100px;" alt=""/><br /><sub><b>Frostedfox</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/hafeezrana"><img src="https://avatars.githubusercontent.com/u/87476445?v=4" width="100px;" alt=""/><br /><sub><b>Hafeez Rana</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/h1376h"><img src="https://avatars.githubusercontent.com/u/3498335?v=4" width="100px;" alt=""/><br /><sub><b>Hamed H.</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/Jtplouffe"><img src="https://avatars.githubusercontent.com/u/32107801?v=4" width="100px;" alt=""/><br /><sub><b>JT</b></sub></a></td>
    </tr>
    <tr>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/ritksm"><img src="https://avatars.githubusercontent.com/u/111809?v=4" width="100px;" alt=""/><br /><sub><b>Jack Rivers</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/nohli"><img src="https://avatars.githubusercontent.com/u/43643339?v=4" width="100px;" alt=""/><br /><sub><b>Joachim Nohl</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/vothvovo"><img src="https://avatars.githubusercontent.com/u/20894472?v=4" width="100px;" alt=""/><br /><sub><b>Johnson</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/VoidxHoshi"><img src="https://avatars.githubusercontent.com/u/55886143?v=4" width="100px;" alt=""/><br /><sub><b>LaLucid</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/letyletylety"><img src="https://avatars.githubusercontent.com/u/16468579?v=4" width="100px;" alt=""/><br /><sub><b>Lety</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/lodisy"><img src="https://avatars.githubusercontent.com/u/8101584?v=4" width="100px;" alt=""/><br /><sub><b>Michael</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/Moseco"><img src="https://avatars.githubusercontent.com/u/10720298?v=4" width="100px;" alt=""/><br /><sub><b>Moseco</b></sub></a></td>
    </tr>
    <tr>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/inkomomutane"><img src="https://avatars.githubusercontent.com/u/57417802?v=4" width="100px;" alt=""/><br /><sub><b>Nelson Mutane</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/Viper-Bit"><img src="https://avatars.githubusercontent.com/u/24822764?v=4" width="100px;" alt=""/><br /><sub><b>Peyman</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/leisim"><img src="https://avatars.githubusercontent.com/u/13610195?v=4" width="100px;" alt=""/><br /><sub><b>Simon Leier</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/ika020202"><img src="https://avatars.githubusercontent.com/u/42883378?v=4" width="100px;" alt=""/><br /><sub><b>Ura</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/blendthink"><img src="https://avatars.githubusercontent.com/u/32213113?v=4" width="100px;" alt=""/><br /><sub><b>blendthink</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/mnkeis"><img src="https://avatars.githubusercontent.com/u/41247357?v=4" width="100px;" alt=""/><br /><sub><b>mnkeis</b></sub></a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/nobkd"><img src="https://avatars.githubusercontent.com/u/44443899?v=4" width="100px;" alt=""/><br /><sub><b>nobkd</b></sub></a></td>
    </tr>
  </tbody>
</table>

<!-- markdownlint-restore -->
<!-- prettier-ignore-end -->

<!-- ALL-CONTRIBUTORS-LIST:END -->
### License

```
Copyright 2022 Simon Leier

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```
11
analysis_options.yaml
Normal file
@ -0,0 +1,11 @@
include: package:very_good_analysis/analysis_options.yaml

analyzer:
  exclude:
    - "lib/src/native/bindings.dart"

  errors:
    cascade_invocations: ignore
    avoid_positional_boolean_parameters: ignore
    parameter_assignments: ignore
    prefer_asserts_with_message: ignore
3
example/README.md
Normal file
@ -0,0 +1,3 @@
## The fastest way to get started is by following the [Quickstart Guide](https://isar.dev/tutorials/quickstart.html)!

Have fun using Isar!
49
lib/isar.dart
Normal file
@ -0,0 +1,49 @@
library isar;

import 'dart:async';
import 'dart:convert';
import 'dart:developer';
import 'dart:typed_data';

import 'package:isar/src/isar_connect_api.dart';
import 'package:isar/src/native/isar_core.dart'
    if (dart.library.html) 'package:isar/src/web/isar_web.dart';
import 'package:isar/src/native/isar_link_impl.dart'
    if (dart.library.html) 'package:isar/src/web/isar_link_impl.dart';
import 'package:isar/src/native/open.dart'
    if (dart.library.html) 'package:isar/src/web/open.dart';
import 'package:isar/src/native/split_words.dart'
    if (dart.library.html) 'package:isar/src/web/split_words.dart';
import 'package:meta/meta.dart';
import 'package:meta/meta_meta.dart';

part 'src/annotations/backlink.dart';
part 'src/annotations/collection.dart';
part 'src/annotations/embedded.dart';
part 'src/annotations/enumerated.dart';
part 'src/annotations/ignore.dart';
part 'src/annotations/index.dart';
part 'src/annotations/name.dart';
part 'src/annotations/type.dart';
part 'src/isar.dart';
part 'src/isar_collection.dart';
part 'src/isar_connect.dart';
part 'src/isar_error.dart';
part 'src/isar_link.dart';
part 'src/isar_reader.dart';
part 'src/isar_writer.dart';
part 'src/query.dart';
part 'src/query_builder.dart';
part 'src/query_builder_extensions.dart';
part 'src/query_components.dart';
part 'src/schema/collection_schema.dart';
part 'src/schema/index_schema.dart';
part 'src/schema/link_schema.dart';
part 'src/schema/property_schema.dart';
part 'src/schema/schema.dart';

/// @nodoc
@protected
typedef IsarUint8List = Uint8List;

const bool _kIsWeb = identical(0, 0.0);
11
lib/src/annotations/backlink.dart
Normal file
@ -0,0 +1,11 @@
part of isar;

/// Annotation to create a backlink to an existing link.
@Target({TargetKind.field})
class Backlink {
  /// Annotation to create a backlink to an existing link.
  const Backlink({required this.to});

  /// The Dart name of the target link.
  final String to;
}
34
lib/src/annotations/collection.dart
Normal file
@ -0,0 +1,34 @@
part of isar;

/// Annotation to create an Isar collection.
const collection = Collection();

/// Annotation to create an Isar collection.
@Target({TargetKind.classType})
class Collection {
  /// Annotation to create an Isar collection.
  const Collection({
    this.inheritance = true,
    this.accessor,
    this.ignore = const {},
  });

  /// Should properties and accessors of parent classes and mixins be included?
  final bool inheritance;

  /// Allows you to override the default collection accessor.
  ///
  /// Example:
  /// ```dart
  /// @Collection(accessor: 'col')
  /// class MyCol {
  ///   Id? id;
  /// }
  ///
  /// // access collection using: isar.col
  /// ```
  final String? accessor;

  /// A list of properties or getter names that Isar should ignore.
  final Set<String> ignore;
}
17
lib/src/annotations/embedded.dart
Normal file
@ -0,0 +1,17 @@
part of isar;

/// Annotation to nest objects of this type in collections.
const embedded = Embedded();

/// Annotation to nest objects of this type in collections.
@Target({TargetKind.classType})
class Embedded {
  /// Annotation to nest objects of this type in collections.
  const Embedded({this.inheritance = true, this.ignore = const {}});

  /// Should properties and accessors of parent classes and mixins be included?
  final bool inheritance;

  /// A list of properties or getter names that Isar should ignore.
  final Set<String> ignore;
}
33
lib/src/annotations/enumerated.dart
Normal file
@ -0,0 +1,33 @@
part of isar;

/// Annotation to specify how an enum property should be serialized.
const enumerated = Enumerated(EnumType.ordinal);

/// Annotation to specify how an enum property should be serialized.
@Target({TargetKind.field, TargetKind.getter})
class Enumerated {
  /// Annotation to specify how an enum property should be serialized.
  const Enumerated(this.type, [this.property]);

  /// How the enum property should be serialized.
  final EnumType type;

  /// The property to use for the enum values.
  final String? property;
}

/// Enum type for enum values.
enum EnumType {
  /// Stores the index of the enum as a byte value.
  ordinal,

  /// Stores the index of the enum as a 4-byte value. Use this type if your enum
  /// has more than 256 values or needs to be nullable.
  ordinal32,

  /// Uses the name of the enum value.
  name,

  /// Uses a custom enum value.
  value
}
11
lib/src/annotations/ignore.dart
Normal file
@ -0,0 +1,11 @@
part of isar;

/// Annotate a property or accessor in an Isar collection to ignore it.
const ignore = Ignore();

/// Annotate a property or accessor in an Isar collection to ignore it.
@Target({TargetKind.field, TargetKind.getter})
class Ignore {
  /// Annotate a property or accessor in an Isar collection to ignore it.
  const Ignore();
}
76
lib/src/annotations/index.dart
Normal file
@ -0,0 +1,76 @@
part of isar;

/// Specifies how an index is stored in Isar.
enum IndexType {
  /// Stores the value as-is in the index.
  value,

  /// Strings or Lists can be hashed to reduce the storage required by the
  /// index. The disadvantage of hash indexes is that they can't be used for
  /// prefix scans (`startsWith()` where clauses). String and list indexes are
  /// hashed by default.
  hash,

  /// `List<String>` can hash its elements.
  hashElements,
}

/// Annotate properties to build an index.
@Target({TargetKind.field, TargetKind.getter})
class Index {
  /// Annotate properties to build an index.
  const Index({
    this.name,
    this.composite = const [],
    this.unique = false,
    this.replace = false,
    this.type,
    this.caseSensitive,
  });

  /// Name of the index. By default, the names of the properties are
  /// concatenated using "_".
  final String? name;

  /// Specify up to two other properties to build a composite index.
  final List<CompositeIndex> composite;

  /// A unique index ensures the index does not contain any duplicate values.
  /// Any attempt to insert or update data into the unique index that causes a
  /// duplicate will result in an error.
  final bool unique;

  /// If set to `true`, inserting a duplicate unique value will replace the
  /// existing object instead of throwing an error.
  final bool replace;

  /// Specifies how an index is stored in Isar.
  ///
  /// Defaults to:
  /// - `IndexType.hash` for `String`s and `List`s
  /// - `IndexType.value` for all other types
  final IndexType? type;

  /// String or `List<String>` indexes can be case sensitive (default) or case
  /// insensitive.
  final bool? caseSensitive;
}

/// Another property that is part of the composite index.
class CompositeIndex {
  /// Another property that is part of the composite index.
  const CompositeIndex(
    this.property, {
    this.type,
    this.caseSensitive,
  });

  /// Dart name of the property.
  final String property;

  /// See [Index.type].
  final IndexType? type;

  /// See [Index.caseSensitive].
  final bool? caseSensitive;
}
13
lib/src/annotations/name.dart
Normal file
@ -0,0 +1,13 @@
part of isar;

/// Annotate Isar collections or properties to change their name.
///
/// Can be used to change the name in Dart independently of Isar.
@Target({TargetKind.classType, TargetKind.field, TargetKind.getter})
class Name {
  /// Annotate Isar collections or properties to change their name.
  const Name(this.name);

  /// The name this entity should have in the database.
  final String name;
}
20
lib/src/annotations/type.dart
Normal file
@ -0,0 +1,20 @@
// ignore_for_file: camel_case_types

part of isar;

/// Type to specify the id property of a collection.
typedef Id = int;

/// Type to mark an [int] property or List as 8-bit sized.
///
/// You may only store values between 0 and 255 in such a property.
typedef byte = int;

/// Type to mark an [int] property or List as 32-bit sized.
///
/// You may only store values between -2147483648 and 2147483647 in such a
/// property.
typedef short = int;

/// Type to mark a [double] property or List to have 32-bit precision.
typedef float = double;
222
lib/src/common/isar_common.dart
Normal file
@ -0,0 +1,222 @@
// ignore_for_file: invalid_use_of_protected_member

import 'dart:async';

import 'package:isar/isar.dart';

const Symbol _zoneTxn = #zoneTxn;

/// @nodoc
abstract class IsarCommon extends Isar {
  /// @nodoc
  IsarCommon(super.name);

  final List<Future<void>> _activeAsyncTxns = [];
  var _asyncWriteTxnsActive = 0;

  Transaction? _currentTxnSync;

  void _requireNotInTxn() {
    if (_currentTxnSync != null || Zone.current[_zoneTxn] != null) {
      throw IsarError(
        'Cannot perform this operation from within an active transaction. '
        'Isar does not support nesting transactions.',
      );
    }
  }

  /// @nodoc
  Future<Transaction> beginTxn(bool write, bool silent);

  Future<T> _beginTxn<T>(
    bool write,
    bool silent,
    Future<T> Function() callback,
  ) async {
    requireOpen();
    _requireNotInTxn();

    final completer = Completer<void>();
    _activeAsyncTxns.add(completer.future);

    try {
      if (write) {
        _asyncWriteTxnsActive++;
      }

      final txn = await beginTxn(write, silent);

      final zone = Zone.current.fork(
        zoneValues: {_zoneTxn: txn},
      );

      T result;
      try {
        result = await zone.run(callback);
        await txn.commit();
      } catch (e) {
        await txn.abort();
        rethrow;
      } finally {
        txn.free();
      }
      return result;
    } finally {
      completer.complete();
      _activeAsyncTxns.remove(completer.future);
      if (write) {
        _asyncWriteTxnsActive--;
      }
    }
  }

  @override
  Future<T> txn<T>(Future<T> Function() callback) {
    return _beginTxn(false, false, callback);
  }

  @override
  Future<T> writeTxn<T>(Future<T> Function() callback, {bool silent = false}) {
    return _beginTxn(true, silent, callback);
  }

  /// @nodoc
  Future<R> getTxn<R, T extends Transaction>(
    bool write,
    Future<R> Function(T txn) callback,
  ) {
    final currentTxn = Zone.current[_zoneTxn] as T?;
    if (currentTxn != null) {
      if (!currentTxn.active) {
        throw IsarError('Transaction is not active anymore. Make sure to await '
            'all your asynchronous code within transactions to prevent it from '
            'being closed prematurely.');
      } else if (write && !currentTxn.write) {
        throw IsarError('Operation cannot be performed within a read '
            'transaction. Use isar.writeTxn() instead.');
      } else if (currentTxn.isar != this) {
        throw IsarError('Transaction does not match Isar instance. '
            'Make sure to use transactions from the same Isar instance.');
      }
      return callback(currentTxn);
    } else if (!write) {
      return _beginTxn(false, false, () {
        return callback(Zone.current[_zoneTxn] as T);
      });
    } else {
      throw IsarError('Write operations require an explicit transaction. '
          'Wrap your code in isar.writeTxn()');
    }
  }

  /// @nodoc
  Transaction beginTxnSync(bool write, bool silent);

  T _beginTxnSync<T>(bool write, bool silent, T Function() callback) {
    requireOpen();
    _requireNotInTxn();

    if (write && _asyncWriteTxnsActive > 0) {
      throw IsarError(
        'An async write transaction is already in progress in this isolate. '
        'You cannot begin a sync write transaction until it is finished. '
        'Use asynchronous transactions if you want to queue multiple write '
        'transactions.',
      );
    }

    final txn = beginTxnSync(write, silent);
    _currentTxnSync = txn;

    T result;
    try {
      result = callback();
      txn.commitSync();
    } catch (e) {
      txn.abortSync();
      rethrow;
    } finally {
      _currentTxnSync = null;
      txn.free();
    }

    return result;
  }

  @override
  T txnSync<T>(T Function() callback) {
    return _beginTxnSync(false, false, callback);
  }

  @override
  T writeTxnSync<T>(T Function() callback, {bool silent = false}) {
    return _beginTxnSync(true, silent, callback);
  }

  /// @nodoc
  R getTxnSync<R, T extends Transaction>(
    bool write,
    R Function(T txn) callback,
  ) {
    if (_currentTxnSync != null) {
      if (write && !_currentTxnSync!.write) {
        throw IsarError(
          'Operation cannot be performed within a read transaction. '
          'Use isar.writeTxnSync() instead.',
        );
      }
      return callback(_currentTxnSync! as T);
    } else if (!write) {
      return _beginTxnSync(false, false, () => callback(_currentTxnSync! as T));
    } else {
      throw IsarError('Write operations require an explicit transaction. '
          'Wrap your code in isar.writeTxnSync()');
    }
  }

  @override
  Future<bool> close({bool deleteFromDisk = false}) async {
    requireOpen();
    _requireNotInTxn();
    await Future.wait(_activeAsyncTxns);
    await super.close();

    return performClose(deleteFromDisk);
  }

  /// @nodoc
  bool performClose(bool deleteFromDisk);
}

/// @nodoc
abstract class Transaction {
  /// @nodoc
  Transaction(this.isar, this.sync, this.write);

  /// @nodoc
  final Isar isar;

  /// @nodoc
  final bool sync;

  /// @nodoc
  final bool write;

  /// @nodoc
  bool get active;

  /// @nodoc
  Future<void> commit();

  /// @nodoc
  void commitSync();

  /// @nodoc
  Future<void> abort();

  /// @nodoc
|
||||||
|
void abortSync();
|
||||||
|
|
||||||
|
/// @nodoc
|
||||||
|
void free() {}
|
||||||
|
}
|
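The transaction plumbing above enforces that writes always run inside an explicit `writeTxn`/`writeTxnSync`, while reads get an implicit read transaction. A minimal usage sketch, assuming a generated `User` collection exposed as `isar.users` (the model and accessor are illustrative assumptions, not part of this file):

```dart
import 'package:isar/isar.dart';

Future<void> example(Isar isar, User user) async {
  // Writes must be wrapped in an explicit write transaction; otherwise
  // 'Write operations require an explicit transaction' is thrown.
  await isar.writeTxn(() async {
    await isar.users.put(user);
  });

  // Reads may run outside an explicit transaction; an implicit read
  // transaction is opened for them.
  final loaded = await isar.users.get(1);

  // The synchronous variants mirror the async API.
  isar.writeTxnSync(() => isar.users.putSync(user));
}
```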
108 lib/src/common/isar_link_base_impl.dart (new file)
@@ -0,0 +1,108 @@
import 'package:isar/isar.dart';

/// @nodoc
abstract class IsarLinkBaseImpl<OBJ> implements IsarLinkBase<OBJ> {
  var _initialized = false;

  Id? _objectId;

  /// The isar name of the link.
  late final String linkName;

  /// The origin collection of the link. For backlinks it is actually the
  /// target collection.
  late final IsarCollection<dynamic> sourceCollection;

  /// The target collection of the link. For backlinks it is actually the
  /// origin collection.
  late final IsarCollection<OBJ> targetCollection;

  @override
  bool get isAttached => _objectId != null;

  @override
  void attach(
    IsarCollection<dynamic> sourceCollection,
    IsarCollection<OBJ> targetCollection,
    String linkName,
    Id? objectId,
  ) {
    if (_initialized) {
      if (linkName != this.linkName ||
          !identical(sourceCollection, this.sourceCollection) ||
          !identical(targetCollection, this.targetCollection)) {
        throw IsarError(
          'Link has been moved! It is not allowed to move '
          'a link to a different collection.',
        );
      }
    } else {
      _initialized = true;
      this.sourceCollection = sourceCollection;
      this.targetCollection = targetCollection;
      this.linkName = linkName;
    }

    _objectId = objectId;
  }

  /// Returns the containing object's id or throws an exception if this link
  /// has not been attached to an object yet.
  Id requireAttached() {
    if (_objectId == null) {
      throw IsarError(
        'Containing object needs to be managed by Isar to use this method. '
        'Use collection.put(yourObject) to add it to the database.',
      );
    } else {
      return _objectId!;
    }
  }

  /// Returns the id of a linked object.
  Id Function(OBJ obj) get getId;

  /// Returns the id of a linked object or throws an exception if the id is
  /// `null` or set to `Isar.autoIncrement`.
  Id requireGetId(OBJ object) {
    final id = getId(object);
    if (id != Isar.autoIncrement) {
      return id;
    } else {
      throw IsarError(
        'Object "$object" has no id and can therefore not be linked. '
        'Make sure to .put() objects before you use them in links.',
      );
    }
  }

  /// See [IsarLinks.filter].
  QueryBuilder<OBJ, OBJ, QAfterFilterCondition> filter() {
    final containingId = requireAttached();
    final qb = QueryBuilderInternal(
      collection: targetCollection,
      whereClauses: [
        LinkWhereClause(
          linkCollection: sourceCollection.name,
          linkName: linkName,
          id: containingId,
        ),
      ],
    );
    return QueryBuilder(qb);
  }

  /// See [IsarLinks.update].
  Future<void> update({
    Iterable<OBJ> link = const [],
    Iterable<OBJ> unlink = const [],
    bool reset = false,
  });

  /// See [IsarLinks.updateSync].
  void updateSync({
    Iterable<OBJ> link = const [],
    Iterable<OBJ> unlink = const [],
    bool reset = false,
  });
}
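The `filter()` method above turns a link into a query over the target collection by prepending a `LinkWhereClause`, so only objects actually linked to the containing object are matched. A hedged sketch of how this surfaces through the public API; the `Teacher`/`Student` model and the generated `nameStartsWith` filter are illustrative assumptions:

```dart
// Assumes: @collection class Teacher { ...; final students = IsarLinks<Student>(); }
Future<List<Student>> studentsNamedA(Teacher teacher) {
  // Queries only the objects linked to this teacher, without loading the
  // whole link into memory first.
  return teacher.students.filter().nameStartsWith('A').findAll();
}
```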
92 lib/src/common/isar_link_common.dart (new file)
@@ -0,0 +1,92 @@
import 'package:isar/isar.dart';
import 'package:isar/src/common/isar_link_base_impl.dart';

const bool _kIsWeb = identical(0, 0.0);

/// @nodoc
abstract class IsarLinkCommon<OBJ> extends IsarLinkBaseImpl<OBJ>
    with IsarLink<OBJ> {
  OBJ? _value;

  @override
  bool isChanged = false;

  @override
  bool isLoaded = false;

  @override
  OBJ? get value {
    if (isAttached && !isLoaded && !isChanged && !_kIsWeb) {
      loadSync();
    }
    return _value;
  }

  @override
  set value(OBJ? value) {
    isChanged |= !identical(_value, value);
    _value = value;
    isLoaded = true;
  }

  @override
  Future<void> load() async {
    _value = await filter().findFirst();
    isChanged = false;
    isLoaded = true;
  }

  @override
  void loadSync() {
    _value = filter().findFirstSync();
    isChanged = false;
    isLoaded = true;
  }

  @override
  Future<void> save() async {
    if (!isChanged) {
      return;
    }

    final object = value;

    await update(link: [if (object != null) object], reset: true);
    isChanged = false;
    isLoaded = true;
  }

  @override
  void saveSync() {
    if (!isChanged) {
      return;
    }

    final object = _value;
    updateSync(link: [if (object != null) object], reset: true);

    isChanged = false;
    isLoaded = true;
  }

  @override
  Future<void> reset() async {
    await update(reset: true);
    _value = null;
    isChanged = false;
    isLoaded = true;
  }

  @override
  void resetSync() {
    updateSync(reset: true);
    _value = null;
    isChanged = false;
    isLoaded = true;
  }

  @override
  String toString() {
    return 'IsarLink($_value)';
  }
}
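Note how the `value` getter lazily calls `loadSync()` on non-web platforms, and how `save()` is a no-op unless the link was actually changed. A hedged usage sketch; the `Profile`/`User` model and the `isar.profiles` accessor are illustrative assumptions:

```dart
// Assumes: @collection class Profile { ...; final user = IsarLink<User>(); }
Future<void> linkUser(Isar isar, Profile profile, User user) async {
  profile.user.value = user; // marks the link as changed
  await isar.writeTxn(() async {
    await isar.profiles.put(profile); // stores the containing object
    await profile.user.save();        // persists the pending link change
  });
}
```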
223 lib/src/common/isar_links_common.dart (new file)
@@ -0,0 +1,223 @@
import 'dart:collection';

import 'package:isar/isar.dart';
import 'package:isar/src/common/isar_link_base_impl.dart';

const bool _kIsWeb = identical(0, 0.0);

/// @nodoc
abstract class IsarLinksCommon<OBJ> extends IsarLinkBaseImpl<OBJ>
    with IsarLinks<OBJ>, SetMixin<OBJ> {
  final _objects = <Id, OBJ>{};

  /// @nodoc
  final addedObjects = HashSet<OBJ>.identity();

  /// @nodoc
  final removedObjects = HashSet<OBJ>.identity();

  @override
  bool isLoaded = false;

  @override
  bool get isChanged => addedObjects.isNotEmpty || removedObjects.isNotEmpty;

  Map<Id, OBJ> get _loadedObjects {
    if (isAttached && !isLoaded && !_kIsWeb) {
      loadSync();
    }
    return _objects;
  }

  @override
  void attach(
    IsarCollection<dynamic> sourceCollection,
    IsarCollection<OBJ> targetCollection,
    String linkName,
    Id? objectId,
  ) {
    super.attach(sourceCollection, targetCollection, linkName, objectId);

    _applyAddedRemoved();
  }

  @override
  Future<void> load({bool overrideChanges = false}) async {
    final objects = await filter().findAll();
    _applyLoaded(objects, overrideChanges);
  }

  @override
  void loadSync({bool overrideChanges = false}) {
    final objects = filter().findAllSync();
    _applyLoaded(objects, overrideChanges);
  }

  void _applyLoaded(List<OBJ> objects, bool overrideChanges) {
    _objects.clear();
    for (final object in objects) {
      final id = getId(object);
      if (id != Isar.autoIncrement) {
        _objects[id] = object;
      }
    }

    if (overrideChanges) {
      addedObjects.clear();
      removedObjects.clear();
    } else {
      _applyAddedRemoved();
    }

    isLoaded = true;
  }

  void _applyAddedRemoved() {
    for (final object in addedObjects) {
      final id = getId(object);
      if (id != Isar.autoIncrement) {
        _objects[id] = object;
      }
    }

    for (final object in removedObjects) {
      final id = getId(object);
      if (id != Isar.autoIncrement) {
        _objects.remove(id);
      }
    }
  }

  @override
  Future<void> save() async {
    if (!isChanged) {
      return;
    }

    await update(link: addedObjects, unlink: removedObjects);

    addedObjects.clear();
    removedObjects.clear();
    isLoaded = true;
  }

  @override
  void saveSync() {
    if (!isChanged) {
      return;
    }

    updateSync(link: addedObjects, unlink: removedObjects);

    addedObjects.clear();
    removedObjects.clear();
    isLoaded = true;
  }

  @override
  Future<void> reset() async {
    await update(reset: true);
    clear();
    isLoaded = true;
  }

  @override
  void resetSync() {
    updateSync(reset: true);
    clear();
    isLoaded = true;
  }

  @override
  bool add(OBJ value) {
    if (isAttached) {
      final id = getId(value);
      if (id != Isar.autoIncrement) {
        if (_objects.containsKey(id)) {
          return false;
        }
        _objects[id] = value;
      }
    }

    removedObjects.remove(value);
    return addedObjects.add(value);
  }

  @override
  bool contains(Object? element) {
    requireAttached();

    if (element is OBJ) {
      final id = getId(element);
      if (id != Isar.autoIncrement) {
        return _loadedObjects.containsKey(id);
      }
    }
    return false;
  }

  @override
  Iterator<OBJ> get iterator => _loadedObjects.values.iterator;

  @override
  int get length => _loadedObjects.length;

  @override
  OBJ? lookup(Object? element) {
    requireAttached();

    if (element is OBJ) {
      final id = getId(element);
      if (id != Isar.autoIncrement) {
        return _loadedObjects[id];
      }
    }
    return null;
  }

  @override
  bool remove(Object? value) {
    if (value is! OBJ) {
      return false;
    }

    if (isAttached) {
      final id = getId(value);
      if (id != Isar.autoIncrement) {
        if (isLoaded && !_objects.containsKey(id)) {
          return false;
        }
        _objects.remove(id);
      }
    }

    addedObjects.remove(value);
    return removedObjects.add(value);
  }

  @override
  Set<OBJ> toSet() {
    requireAttached();
    return HashSet(
      equals: (o1, o2) => getId(o1) == getId(o2),
      // ignore: noop_primitive_operations
      hashCode: (o) => getId(o).toInt(),
      isValidKey: (o) => o is OBJ && getId(o) != Isar.autoIncrement,
    )..addAll(_loadedObjects.values);
  }

  @override
  void clear() {
    _objects.clear();
    addedObjects.clear();
    removedObjects.clear();
  }

  @override
  String toString() {
    final content =
        IterableBase.iterableToFullString(_objects.values, '{', '}');
    return 'IsarLinks($content)';
  }
}
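Because `IsarLinksCommon` mixes in `SetMixin`, a to-many link behaves like a `Set` keyed by object id; `add` and `remove` only record pending changes in `addedObjects`/`removedObjects` until `save()` applies them. A hedged sketch, with the `Teacher`/`Student` model as an illustrative assumption:

```dart
// Assumes: @collection class Teacher { ...; final students = IsarLinks<Student>(); }
Future<void> enroll(Isar isar, Teacher teacher, Student student) async {
  teacher.students.add(student); // recorded in addedObjects, nothing persisted yet
  await isar.writeTxn(() async {
    await teacher.students.save(); // applies the pending added/removed objects
  });
}
```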
13 lib/src/common/schemas.dart (new file)
@@ -0,0 +1,13 @@
import 'package:isar/isar.dart';

/// @nodoc
List<Schema<dynamic>> getSchemas(
  List<CollectionSchema<dynamic>> collectionSchemas,
) {
  final schemas = <Schema<dynamic>>{};
  for (final collectionSchema in collectionSchemas) {
    schemas.add(collectionSchema);
    schemas.addAll(collectionSchema.embeddedSchemas.values);
  }
  return schemas.toList();
}
347 lib/src/isar.dart (new file)
@@ -0,0 +1,347 @@
part of isar;

/// Callback for a newly opened Isar instance.
typedef IsarOpenCallback = void Function(Isar isar);

/// Callback for a released Isar instance.
typedef IsarCloseCallback = void Function(String isarName);

/// An instance of the Isar Database.
abstract class Isar {
  /// @nodoc
  @protected
  Isar(this.name) {
    _instances[name] = this;
    for (final callback in _openCallbacks) {
      callback(this);
    }
  }

  /// The version of the Isar library.
  static const version = '3.1.0+1';

  /// Smallest valid id.
  static const Id minId = isarMinId;

  /// Largest valid id.
  static const Id maxId = isarMaxId;

  /// The default Isar instance name.
  static const String defaultName = 'default';

  /// The default max Isar size.
  static const int defaultMaxSizeMiB = 1024;

  /// Placeholder for an auto-increment id.
  static const Id autoIncrement = isarAutoIncrementId;

  static final Map<String, Isar> _instances = <String, Isar>{};
  static final Set<IsarOpenCallback> _openCallbacks = <IsarOpenCallback>{};
  static final Set<IsarCloseCallback> _closeCallbacks = <IsarCloseCallback>{};

  /// Name of the instance.
  final String name;

  /// The directory containing the database file or `null` on the web.
  String? get directory;

  /// The full path of the database file is `directory/name.isar` and the lock
  /// file `directory/name.isar.lock`.
  String? get path => directory != null ? '$directory/$name.isar' : null;

  late final Map<Type, IsarCollection<dynamic>> _collections;
  late final Map<String, IsarCollection<dynamic>> _collectionsByName;

  bool _isOpen = true;

  static void _checkOpen(String name, List<CollectionSchema<dynamic>> schemas) {
    if (name.isEmpty || name.startsWith('_')) {
      throw IsarError('Instance names must not be empty or start with "_".');
    }
    if (_instances.containsKey(name)) {
      throw IsarError('Instance has already been opened.');
    }
    if (schemas.isEmpty) {
      throw IsarError('At least one collection needs to be opened.');
    }

    final schemaNames = <String>{};
    for (final schema in schemas) {
      if (!schemaNames.add(schema.name)) {
        throw IsarError('Duplicate collection ${schema.name}.');
      }
    }
    for (final schema in schemas) {
      final dependencies = schema.links.values.map((e) => e.target);
      for (final dependency in dependencies) {
        if (!schemaNames.contains(dependency)) {
          throw IsarError(
            'Collection ${schema.name} depends on $dependency but its schema '
            'was not provided.',
          );
        }
      }
    }
  }

  /// Open a new Isar instance.
  static Future<Isar> open(
    List<CollectionSchema<dynamic>> schemas, {
    required String directory,
    String name = defaultName,
    int maxSizeMiB = Isar.defaultMaxSizeMiB,
    bool relaxedDurability = true,
    CompactCondition? compactOnLaunch,
    bool inspector = true,
  }) {
    _checkOpen(name, schemas);

    // Tree shake the inspector for profile and release builds.
    assert(() {
      if (!_kIsWeb && inspector) {
        _IsarConnect.initialize(schemas);
      }
      return true;
    }());

    return openIsar(
      schemas: schemas,
      directory: directory,
      name: name,
      maxSizeMiB: maxSizeMiB,
      relaxedDurability: relaxedDurability,
      compactOnLaunch: compactOnLaunch,
    );
  }

  /// Open a new Isar instance.
  static Isar openSync(
    List<CollectionSchema<dynamic>> schemas, {
    required String directory,
    String name = defaultName,
    int maxSizeMiB = Isar.defaultMaxSizeMiB,
    bool relaxedDurability = true,
    CompactCondition? compactOnLaunch,
    bool inspector = true,
  }) {
    _checkOpen(name, schemas);

    // Tree shake the inspector for profile and release builds.
    assert(() {
      if (!_kIsWeb && inspector) {
        _IsarConnect.initialize(schemas);
      }
      return true;
    }());

    return openIsarSync(
      schemas: schemas,
      directory: directory,
      name: name,
      maxSizeMiB: maxSizeMiB,
      relaxedDurability: relaxedDurability,
      compactOnLaunch: compactOnLaunch,
    );
  }

  /// Is the instance open?
  bool get isOpen => _isOpen;

  /// @nodoc
  @protected
  void requireOpen() {
    if (!isOpen) {
      throw IsarError('Isar instance has already been closed');
    }
  }

  /// Executes an asynchronous read-only transaction.
  Future<T> txn<T>(Future<T> Function() callback);

  /// Executes an asynchronous read-write transaction.
  ///
  /// If [silent] is `true`, watchers are not notified about changes in this
  /// transaction.
  Future<T> writeTxn<T>(Future<T> Function() callback, {bool silent = false});

  /// Executes a synchronous read-only transaction.
  T txnSync<T>(T Function() callback);

  /// Executes a synchronous read-write transaction.
  ///
  /// If [silent] is `true`, watchers are not notified about changes in this
  /// transaction.
  T writeTxnSync<T>(T Function() callback, {bool silent = false});

  /// @nodoc
  @protected
  void attachCollections(Map<Type, IsarCollection<dynamic>> collections) {
    _collections = collections;
    _collectionsByName = {
      for (IsarCollection<dynamic> col in collections.values) col.name: col,
    };
  }

  /// Get a collection by its type.
  ///
  /// You should use the generated extension methods instead.
  IsarCollection<T> collection<T>() {
    requireOpen();
    final collection = _collections[T];
    if (collection == null) {
      throw IsarError('Missing ${T.runtimeType}Schema in Isar.open');
    }
    return collection as IsarCollection<T>;
  }

  /// @nodoc
  @protected
  IsarCollection<dynamic>? getCollectionByNameInternal(String name) {
    return _collectionsByName[name];
  }

  /// Remove all data in this instance and reset the auto-increment values.
  Future<void> clear() async {
    for (final col in _collections.values) {
      await col.clear();
    }
  }

  /// Remove all data in this instance and reset the auto-increment values.
  void clearSync() {
    for (final col in _collections.values) {
      col.clearSync();
    }
  }

  /// Returns the size of all the collections in bytes. Not supported on web.
  ///
  /// This method is extremely fast and independent of the number of objects in
  /// the instance.
  Future<int> getSize({bool includeIndexes = false, bool includeLinks = false});

  /// Returns the size of all collections in bytes. Not supported on web.
  ///
  /// This method is extremely fast and independent of the number of objects in
  /// the instance.
  int getSizeSync({bool includeIndexes = false, bool includeLinks = false});

  /// Copy a compacted version of the database to the specified file.
  ///
  /// If you want to back up your database, you should always use a compacted
  /// version. Compacted does not mean compressed.
  ///
  /// Do not run this method while other transactions are active to avoid
  /// unnecessary growth of the database.
  Future<void> copyToFile(String targetPath);

  /// Releases an Isar instance.
  ///
  /// If this is the only isolate that holds a reference to this instance, the
  /// Isar instance will be closed. [deleteFromDisk] additionally removes all
  /// database files if enabled.
  ///
  /// Returns whether the instance was actually closed.
  Future<bool> close({bool deleteFromDisk = false}) {
    requireOpen();
    _isOpen = false;
    if (identical(_instances[name], this)) {
      _instances.remove(name);
    }
    for (final callback in _closeCallbacks) {
      callback(name);
    }
    return Future.value(false);
  }

  /// Verifies the integrity of the database file.
  ///
  /// Do not use this method in production apps.
  @visibleForTesting
  @experimental
  Future<void> verify();

  /// The names of all Isar instances opened in the current isolate.
  static Set<String> get instanceNames => _instances.keys.toSet();

  /// Returns an Isar instance opened in the current isolate by its name. If
  /// no name is provided, the default instance is returned.
  static Isar? getInstance([String name = defaultName]) {
    return _instances[name];
  }

  /// Registers a listener that is called whenever an Isar instance is opened.
  static void addOpenListener(IsarOpenCallback callback) {
    _openCallbacks.add(callback);
  }

  /// Removes a previously registered `IsarOpenCallback`.
  static void removeOpenListener(IsarOpenCallback callback) {
    _openCallbacks.remove(callback);
  }

  /// Registers a listener that is called whenever an Isar instance is
  /// released.
  static void addCloseListener(IsarCloseCallback callback) {
    _closeCallbacks.add(callback);
  }

  /// Removes a previously registered `IsarCloseCallback`.
  static void removeCloseListener(IsarCloseCallback callback) {
    _closeCallbacks.remove(callback);
  }

  /// Initialize Isar Core manually. You need to provide Isar Core libraries
  /// for every platform your app will run on.
  ///
  /// If [download] is `true`, Isar will attempt to download the correct
  /// library and place it in the specified path or the script directory.
  ///
  /// Be careful if multiple unit tests try to download the library at the
  /// same time. Always use `flutter test -j 1` when you rely on auto
  /// downloading to ensure that only one test is running at a time.
  ///
  /// Only use this method for non-Flutter code or unit tests.
  static Future<void> initializeIsarCore({
    Map<IsarAbi, String> libraries = const {},
    bool download = false,
  }) async {
    await initializeCoreBinary(
      libraries: libraries,
      download: download,
    );
  }

  /// Split a String into words according to Unicode Annex #29. Only words
  /// containing at least one alphanumeric character will be included.
  static List<String> splitWords(String input) => isarSplitWords(input);
}

/// Isar databases can contain unused space that will be reused for later
/// operations. You can specify conditions to trigger manual compaction where
/// the entire database is copied and unused space freed.
///
/// This operation can only be performed while a database is being opened and
/// should only be used if absolutely necessary.
class CompactCondition {
  /// Compaction will happen if all of the specified conditions are true.
  const CompactCondition({
    this.minFileSize,
    this.minBytes,
    this.minRatio,
  }) : assert(
          minFileSize != null || minBytes != null || minRatio != null,
          'At least one condition needs to be specified.',
        );

  /// The minimum size in bytes of the database file to trigger compaction. It
  /// is highly discouraged to trigger compaction solely on this condition.
  final int? minFileSize;

  /// The minimum number of bytes that can be freed with compaction.
  final int? minBytes;

  /// The minimum compaction ratio. For example `2.0` would trigger compaction
  /// as soon as the file size can be halved.
  final double? minRatio;
}
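Putting the pieces together: since `directory` is now a required parameter, opening an instance with an optional compaction condition looks roughly like this hedged sketch (the `UserSchema` and `dir` values are illustrative assumptions from a generated collection and a path provider):

```dart
final isar = await Isar.open(
  [UserSchema],                    // generated collection schema, assumed here
  directory: dir.path,             // required since 3.1.0
  maxSizeMiB: 256,                 // defaults to Isar.defaultMaxSizeMiB (1024)
  compactOnLaunch: const CompactCondition(
    minFileSize: 10 * 1024 * 1024, // only consider files over 10 MiB
    minRatio: 2.0,                 // compact if the file size can be halved
  ),
);
```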
342 lib/src/isar_collection.dart (new file)
@@ -0,0 +1,342 @@
part of isar;
|
||||||
|
|
||||||
|
/// Normal keys consist of a single object, composite keys multiple.
|
||||||
|
typedef IndexKey = List<Object?>;
|
||||||
|
|
||||||
|
/// Use `IsarCollection` instances to find, query, and create new objects of a
|
||||||
|
/// given type in Isar.
|
||||||
|
///
|
||||||
|
/// You can get an instance of `IsarCollection` by calling `isar.get<OBJ>()` or
|
||||||
|
/// by using the generated `isar.yourCollections` getter.
|
||||||
|
abstract class IsarCollection<OBJ> {
|
||||||
|
/// The corresponding Isar instance.
|
||||||
|
Isar get isar;
|
||||||
|
|
||||||
|
/// Get the schema of the collection.
|
||||||
|
CollectionSchema<OBJ> get schema;
|
||||||
|
|
||||||
|
/// The name of the collection.
|
||||||
|
String get name => schema.name;
|
||||||
|
|
||||||
|
/// {@template col_get}
|
||||||
|
/// Get a single object by its [id] or `null` if the object does not exist.
|
||||||
|
/// {@endtemplate}
|
||||||
|
Future<OBJ?> get(Id id) {
|
||||||
|
return getAll([id]).then((List<OBJ?> objects) => objects[0]);
|
||||||
|
}
|
||||||
|
|
||||||
|
/// {@macro col_get}
|
||||||
|
OBJ? getSync(Id id) {
|
||||||
|
return getAllSync([id])[0];
|
||||||
|
}
|
||||||
|
|
||||||
|
  /// {@template col_get_all}
  /// Get a list of objects by their [ids] or `null` if an object does not
  /// exist.
  /// {@endtemplate}
  Future<List<OBJ?>> getAll(List<Id> ids);

  /// {@macro col_get_all}
  List<OBJ?> getAllSync(List<Id> ids);

  /// {@template col_get_by_index}
  /// Get a single object by the unique index [indexName] and [key].
  ///
  /// Returns `null` if the object does not exist.
  ///
  /// If possible, you should use the generated type-safe methods instead.
  /// {@endtemplate}
  @experimental
  Future<OBJ?> getByIndex(String indexName, IndexKey key) {
    return getAllByIndex(indexName, [key])
        .then((List<OBJ?> objects) => objects[0]);
  }

  /// {@macro col_get_by_index}
  @experimental
  OBJ? getByIndexSync(String indexName, IndexKey key) {
    return getAllByIndexSync(indexName, [key])[0];
  }

  /// {@template col_get_all_by_index}
  /// Get a list of objects by the unique index [indexName] and [keys].
  ///
  /// Objects that don't exist are returned as `null`.
  ///
  /// If possible, you should use the generated type-safe methods instead.
  /// {@endtemplate}
  @experimental
  Future<List<OBJ?>> getAllByIndex(String indexName, List<IndexKey> keys);

  /// {@macro col_get_all_by_index}
  @experimental
  List<OBJ?> getAllByIndexSync(String indexName, List<IndexKey> keys);
  /// {@template col_put}
  /// Insert or update an [object]. Returns the id of the new or updated object.
  ///
  /// If the object has a non-final id property, it will be set to the assigned
  /// id. Otherwise you should use the returned id to update the object.
  /// {@endtemplate}
  Future<Id> put(OBJ object) {
    return putAll([object]).then((List<Id> ids) => ids[0]);
  }

  /// {@macro col_put}
  Id putSync(OBJ object, {bool saveLinks = true}) {
    return putAllSync([object], saveLinks: saveLinks)[0];
  }

  /// {@template col_put_all}
  /// Insert or update a list of [objects]. Returns the list of ids of the new
  /// or updated objects.
  ///
  /// If the objects have a non-final id property, it will be set to the
  /// assigned id. Otherwise you should use the returned ids to update the
  /// objects.
  /// {@endtemplate}
  Future<List<Id>> putAll(List<OBJ> objects);

  /// {@macro col_put_all}
  List<Id> putAllSync(List<OBJ> objects, {bool saveLinks = true});

  /// {@template col_put_by_index}
  /// Insert or update the [object] by the unique index [indexName]. Returns the
  /// id of the new or updated object.
  ///
  /// If there is already an object with the same index key, it will be
  /// updated and all links will be preserved. Otherwise a new object will be
  /// inserted.
  ///
  /// If the object has a non-final id property, it will be set to the assigned
  /// id. Otherwise you should use the returned id to update the object.
  ///
  /// If possible, you should use the generated type-safe methods instead.
  /// {@endtemplate}
  @experimental
  Future<Id> putByIndex(String indexName, OBJ object) {
    return putAllByIndex(indexName, [object]).then((List<Id> ids) => ids[0]);
  }

  /// {@macro col_put_by_index}
  @experimental
  Id putByIndexSync(String indexName, OBJ object, {bool saveLinks = true}) {
    // Forward saveLinks so the parameter is honored.
    return putAllByIndexSync(indexName, [object], saveLinks: saveLinks)[0];
  }
  /// {@template col_put_all_by_index}
  /// Insert or update a list of [objects] by the unique index [indexName].
  /// Returns the list of ids of the new or updated objects.
  ///
  /// If there is already an object with the same index key, it will be
  /// updated and all links will be preserved. Otherwise a new object will be
  /// inserted.
  ///
  /// If the objects have a non-final id property, it will be set to the
  /// assigned id. Otherwise you should use the returned ids to update the
  /// objects.
  ///
  /// If possible, you should use the generated type-safe methods instead.
  /// {@endtemplate}
  @experimental
  Future<List<Id>> putAllByIndex(String indexName, List<OBJ> objects);

  /// {@macro col_put_all_by_index}
  @experimental
  List<Id> putAllByIndexSync(
    String indexName,
    List<OBJ> objects, {
    bool saveLinks = true,
  });
  /// {@template col_delete}
  /// Delete a single object by its [id].
  ///
  /// Returns whether the object has been deleted. Isar web always returns
  /// `true`.
  /// {@endtemplate}
  Future<bool> delete(Id id) {
    return deleteAll([id]).then((int count) => count == 1);
  }

  /// {@macro col_delete}
  bool deleteSync(Id id) {
    return deleteAllSync([id]) == 1;
  }

  /// {@template col_delete_all}
  /// Delete a list of objects by their [ids].
  ///
  /// Returns the number of objects that have been deleted. Isar web always
  /// returns `ids.length`.
  /// {@endtemplate}
  Future<int> deleteAll(List<Id> ids);

  /// {@macro col_delete_all}
  int deleteAllSync(List<Id> ids);
  /// {@template col_delete_by_index}
  /// Delete a single object by the unique index [indexName] and [key].
  ///
  /// Returns whether the object has been deleted. Isar web always returns
  /// `true`.
  /// {@endtemplate}
  @experimental
  Future<bool> deleteByIndex(String indexName, IndexKey key) {
    return deleteAllByIndex(indexName, [key]).then((int count) => count == 1);
  }

  /// {@macro col_delete_by_index}
  @experimental
  bool deleteByIndexSync(String indexName, IndexKey key) {
    return deleteAllByIndexSync(indexName, [key]) == 1;
  }

  /// {@template col_delete_all_by_index}
  /// Delete a list of objects by the unique index [indexName] and [keys].
  ///
  /// Returns the number of objects that have been deleted. Isar web always
  /// returns `keys.length`.
  /// {@endtemplate}
  @experimental
  Future<int> deleteAllByIndex(String indexName, List<IndexKey> keys);

  /// {@macro col_delete_all_by_index}
  @experimental
  int deleteAllByIndexSync(String indexName, List<IndexKey> keys);

  /// {@template col_clear}
  /// Remove all data in this collection and reset the auto increment value.
  /// {@endtemplate}
  Future<void> clear();

  /// {@macro col_clear}
  void clearSync();
  /// {@template col_import_json_raw}
  /// Import a list of JSON objects encoded as a byte array.
  ///
  /// The JSON objects must have the same structure as the objects in this
  /// collection. Otherwise an exception will be thrown.
  /// {@endtemplate}
  Future<void> importJsonRaw(Uint8List jsonBytes);

  /// {@macro col_import_json_raw}
  void importJsonRawSync(Uint8List jsonBytes);

  /// {@template col_import_json}
  /// Import a list of JSON objects.
  ///
  /// The JSON objects must have the same structure as the objects in this
  /// collection. Otherwise an exception will be thrown.
  /// {@endtemplate}
  Future<void> importJson(List<Map<String, dynamic>> json);

  /// {@macro col_import_json}
  void importJsonSync(List<Map<String, dynamic>> json);

  /// Start building a query using the [QueryBuilder].
  ///
  /// You can use where clauses to only return [distinct] results. If you want
  /// to reverse the order, set [sort] to [Sort.desc].
  QueryBuilder<OBJ, OBJ, QWhere> where({
    bool distinct = false,
    Sort sort = Sort.asc,
  }) {
    final qb = QueryBuilderInternal(
      collection: this,
      whereDistinct: distinct,
      whereSort: sort,
    );
    return QueryBuilder(qb);
  }

  /// Start building a query using the [QueryBuilder].
  ///
  /// Shortcut if you don't want to use where clauses.
  QueryBuilder<OBJ, OBJ, QFilterCondition> filter() => where().filter();
  /// Build a query dynamically, for example to build a custom query language.
  ///
  /// Using this method is highly discouraged; it should only be used in very
  /// special cases. If you open an issue, please always mention that you used
  /// this method.
  ///
  /// The type argument [R] needs to be equal to [OBJ] if no [property] is
  /// specified. Otherwise it should be the type of the property.
  @experimental
  Query<R> buildQuery<R>({
    List<WhereClause> whereClauses = const [],
    bool whereDistinct = false,
    Sort whereSort = Sort.asc,
    FilterOperation? filter,
    List<SortProperty> sortBy = const [],
    List<DistinctProperty> distinctBy = const [],
    int? offset,
    int? limit,
    String? property,
  });
  /// {@template col_count}
  /// Returns the total number of objects in this collection.
  ///
  /// For non-web apps, this method is extremely fast and independent of the
  /// number of objects in the collection.
  /// {@endtemplate}
  Future<int> count();

  /// {@macro col_count}
  int countSync();

  /// {@template col_size}
  /// Returns the size of the collection in bytes. Not supported on web.
  ///
  /// For non-web apps, this method is extremely fast and independent of the
  /// number of objects in the collection.
  /// {@endtemplate}
  Future<int> getSize({bool includeIndexes = false, bool includeLinks = false});

  /// {@macro col_size}
  int getSizeSync({bool includeIndexes = false, bool includeLinks = false});

  /// Watch the collection for changes.
  ///
  /// If [fireImmediately] is `true`, an event will be fired immediately.
  Stream<void> watchLazy({bool fireImmediately = false});
  /// Watch the object with [id] for changes. If a change occurs, the new object
  /// will be returned in the stream.
  ///
  /// Objects that don't exist (yet) can also be watched. If [fireImmediately]
  /// is `true`, the object will be sent to the consumer immediately.
  Stream<OBJ?> watchObject(Id id, {bool fireImmediately = false});

  /// Watch the object with [id] for changes.
  ///
  /// If [fireImmediately] is `true`, an event will be fired immediately.
  Stream<void> watchObjectLazy(Id id, {bool fireImmediately = false});

  /// Verifies the integrity of the collection and its indexes.
  ///
  /// Throws an exception if the collection does not contain exactly the
  /// provided [objects].
  ///
  /// Do not use this method in production apps.
  @visibleForTesting
  @experimental
  Future<void> verify(List<OBJ> objects);

  /// Verifies the integrity of a link.
  ///
  /// Throws an exception if not exactly the [sourceIds] are linked to the
  /// [targetIds].
  ///
  /// Do not use this method in production apps.
  @visibleForTesting
  @experimental
  Future<void> verifyLink(
    String linkName,
    List<int> sourceIds,
    List<int> targetIds,
  );
}
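Taken together, the methods above form the collection's CRUD surface. A minimal usage sketch; the `Contact` model, the generated `isar.contacts` accessor, and the generated query methods (`nameEqualTo`, `findAll`) are hypothetical names produced by code generation, not part of this file:

```dart
// Assumes a generated `Contact` collection with an `Id id` and `String name`.
Future<void> crudExample(Isar isar) async {
  // Writes must happen inside a write transaction.
  final id = await isar.writeTxn(() {
    return isar.contacts.put(Contact()..name = 'Alice');
  });

  // Reads work outside of transactions.
  final contact = await isar.contacts.get(id); // null if it does not exist
  final total = await isar.contacts.count();

  // Queries are built via where() or the filter() shortcut.
  final alices = await isar.contacts.filter().nameEqualTo('Alice').findAll();

  await isar.writeTxn(() => isar.contacts.delete(id));
}
```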
263
lib/src/isar_connect.dart
Normal file
@ -0,0 +1,263 @@
// coverage:ignore-file
// ignore_for_file: avoid_print

part of isar;

abstract class _IsarConnect {
  static const Map<ConnectAction,
      Future<dynamic> Function(Map<String, dynamic> _)> _handlers = {
    ConnectAction.getSchema: _getSchema,
    ConnectAction.listInstances: _listInstances,
    ConnectAction.watchInstance: _watchInstance,
    ConnectAction.executeQuery: _executeQuery,
    ConnectAction.removeQuery: _removeQuery,
    ConnectAction.importJson: _importJson,
    ConnectAction.exportJson: _exportJson,
    ConnectAction.editProperty: _editProperty,
  };

  static List<CollectionSchema<dynamic>>? _schemas;

  // ignore: cancel_subscriptions
  static final _querySubscription = <StreamSubscription<void>>[];
  static final List<StreamSubscription<void>> _collectionSubscriptions =
      <StreamSubscription<void>>[];

  static void initialize(List<CollectionSchema<dynamic>> schemas) {
    if (_schemas != null) {
      return;
    }
    _schemas = schemas;

    Isar.addOpenListener((_) {
      postEvent(ConnectEvent.instancesChanged.event, {});
    });

    Isar.addCloseListener((_) {
      postEvent(ConnectEvent.instancesChanged.event, {});
    });

    for (final handler in _handlers.entries) {
      registerExtension(handler.key.method,
          (String method, Map<String, String> parameters) async {
        try {
          final args = parameters.containsKey('args')
              ? jsonDecode(parameters['args']!) as Map<String, dynamic>
              : <String, dynamic>{};
          final result = <String, dynamic>{'result': await handler.value(args)};
          return ServiceExtensionResponse.result(jsonEncode(result));
        } catch (e) {
          return ServiceExtensionResponse.error(
            ServiceExtensionResponse.extensionError,
            e.toString(),
          );
        }
      });
    }

    _printConnection();
  }
  static void _printConnection() {
    Service.getInfo().then((ServiceProtocolInfo info) {
      final serviceUri = info.serverUri;
      if (serviceUri == null) {
        return;
      }
      final port = serviceUri.port;
      var path = serviceUri.path;
      if (path.endsWith('/')) {
        path = path.substring(0, path.length - 1);
      }
      if (path.endsWith('=')) {
        path = path.substring(0, path.length - 1);
      }
      final url = ' https://inspect.isar.dev/${Isar.version}/#/$port$path ';
      String line(String text, String fill) {
        final fillCount = url.length - text.length;
        final left = List.filled(fillCount ~/ 2, fill);
        final right = List.filled(fillCount - left.length, fill);
        return left.join() + text + right.join();
      }

      print('╔${line('', '═')}╗');
      print('║${line('ISAR CONNECT STARTED', ' ')}║');
      print('╟${line('', '─')}╢');
      print('║${line('Open the link to connect to the Isar', ' ')}║');
      print('║${line('Inspector while this build is running.', ' ')}║');
      print('╟${line('', '─')}╢');
      print('║$url║');
      print('╚${line('', '═')}╝');
    });
  }
  static Future<dynamic> _getSchema(Map<String, dynamic> _) async {
    return _schemas!.map((e) => e.toJson()).toList();
  }

  static Future<dynamic> _listInstances(Map<String, dynamic> _) async {
    return Isar.instanceNames.toList();
  }

  static Future<bool> _watchInstance(Map<String, dynamic> params) async {
    for (final sub in _collectionSubscriptions) {
      unawaited(sub.cancel());
    }

    _collectionSubscriptions.clear();
    if (params.isEmpty) {
      return true;
    }

    final instanceName = params['instance'] as String;
    final instance = Isar.getInstance(instanceName)!;

    for (final collection in instance._collections.values) {
      final sub = collection.watchLazy(fireImmediately: true).listen((_) {
        _sendCollectionInfo(collection);
      });
      _collectionSubscriptions.add(sub);
    }

    return true;
  }
  static void _sendCollectionInfo(IsarCollection<dynamic> collection) {
    final count = collection.countSync();
    final size = collection.getSizeSync(
      includeIndexes: true,
      includeLinks: true,
    );
    final collectionInfo = ConnectCollectionInfo(
      instance: collection.isar.name,
      collection: collection.name,
      size: size,
      count: count,
    );
    postEvent(
      ConnectEvent.collectionInfoChanged.event,
      collectionInfo.toJson(),
    );
  }
  static Future<Map<String, dynamic>> _executeQuery(
    Map<String, dynamic> params,
  ) async {
    for (final sub in _querySubscription) {
      unawaited(sub.cancel());
    }
    _querySubscription.clear();

    final cQuery = ConnectQuery.fromJson(params);
    final instance = Isar.getInstance(cQuery.instance)!;

    final links =
        _schemas!.firstWhere((e) => e.name == cQuery.collection).links.values;

    final query = cQuery.toQuery();
    params.remove('limit');
    params.remove('offset');
    final countQuery = ConnectQuery.fromJson(params).toQuery();

    _querySubscription.add(
      query.watchLazy().listen((_) {
        postEvent(ConnectEvent.queryChanged.event, {});
      }),
    );
    final subscribed = {cQuery.collection};
    for (final link in links) {
      if (subscribed.add(link.target)) {
        final target = instance.getCollectionByNameInternal(link.target)!;
        _querySubscription.add(
          target.watchLazy().listen((_) {
            postEvent(ConnectEvent.queryChanged.event, {});
          }),
        );
      }
    }

    final objects = await query.exportJson();
    if (links.isNotEmpty) {
      final source = instance.getCollectionByNameInternal(cQuery.collection)!;
      for (final object in objects) {
        for (final link in links) {
          final target = instance.getCollectionByNameInternal(link.target)!;
          // Renamed from `links` to avoid shadowing the outer variable.
          final linkedObjects = await target.buildQuery<dynamic>(
            whereClauses: [
              LinkWhereClause(
                linkCollection: source.name,
                linkName: link.name,
                id: object[source.schema.idName] as int,
              ),
            ],
            limit: link.single ? 1 : null,
          ).exportJson();

          if (link.single) {
            object[link.name] =
                linkedObjects.isEmpty ? null : linkedObjects.first;
          } else {
            object[link.name] = linkedObjects;
          }
        }
      }
    }

    return {
      'objects': objects,
      'count': await countQuery.count(),
    };
  }
  static Future<bool> _removeQuery(Map<String, dynamic> params) async {
    final query = ConnectQuery.fromJson(params).toQuery();
    await query.isar.writeTxn(query.deleteAll);
    return true;
  }

  static Future<void> _importJson(Map<String, dynamic> params) async {
    final instance = Isar.getInstance(params['instance'] as String)!;
    final collection =
        instance.getCollectionByNameInternal(params['collection'] as String)!;
    final objects = (params['objects'] as List).cast<Map<String, dynamic>>();
    await instance.writeTxn(() async {
      await collection.importJson(objects);
    });
  }

  static Future<List<dynamic>> _exportJson(Map<String, dynamic> params) async {
    final query = ConnectQuery.fromJson(params).toQuery();
    return query.exportJson();
  }
  static Future<void> _editProperty(Map<String, dynamic> params) async {
    final cEdit = ConnectEdit.fromJson(params);
    final isar = Isar.getInstance(cEdit.instance)!;
    final collection = isar.getCollectionByNameInternal(cEdit.collection)!;
    final keys = cEdit.path.split('.');

    final query = collection.buildQuery<dynamic>(
      whereClauses: [IdWhereClause.equalTo(value: cEdit.id)],
    );

    final objects = await query.exportJson();
    if (objects.isNotEmpty) {
      dynamic object = objects.first;
      for (var i = 0; i < keys.length; i++) {
        if (i == keys.length - 1 && object is Map) {
          object[keys[i]] = cEdit.value;
        } else if (object is Map) {
          object = object[keys[i]];
        } else if (object is List) {
          object = object[int.parse(keys[i])];
        }
      }
      try {
        await isar.writeTxn(() async {
          await collection.importJson(objects);
        });
      } catch (e) {
        print(e);
      }
    }
  }
}
215
lib/src/isar_connect_api.dart
Normal file
@ -0,0 +1,215 @@
// coverage:ignore-file
// ignore_for_file: public_member_api_docs

import 'package:isar/isar.dart';

enum ConnectAction {
  getSchema('ext.isar.getSchema'),
  listInstances('ext.isar.listInstances'),
  watchInstance('ext.isar.watchInstance'),
  executeQuery('ext.isar.executeQuery'),
  removeQuery('ext.isar.removeQuery'),
  importJson('ext.isar.importJson'),
  exportJson('ext.isar.exportJson'),
  editProperty('ext.isar.editProperty');

  const ConnectAction(this.method);

  final String method;
}

enum ConnectEvent {
  instancesChanged('isar.instancesChanged'),
  queryChanged('isar.queryChanged'),
  collectionInfoChanged('isar.collectionInfoChanged');

  const ConnectEvent(this.event);

  final String event;
}
class ConnectQuery {
  ConnectQuery({
    required this.instance,
    required this.collection,
    this.filter,
    this.offset,
    this.limit,
    this.sortProperty,
    this.sortAsc,
  });

  factory ConnectQuery.fromJson(Map<String, dynamic> json) {
    return ConnectQuery(
      instance: json['instance'] as String,
      collection: json['collection'] as String,
      filter: _filterFromJson(json['filter'] as Map<String, dynamic>?),
      offset: json['offset'] as int?,
      limit: json['limit'] as int?,
      sortProperty: json['sortProperty'] as String?,
      sortAsc: json['sortAsc'] as bool?,
    );
  }

  final String instance;
  final String collection;
  final FilterOperation? filter;
  final int? offset;
  final int? limit;
  final String? sortProperty;
  final bool? sortAsc;

  Map<String, dynamic> toJson() {
    return {
      'instance': instance,
      'collection': collection,
      if (filter != null) 'filter': _filterToJson(filter!),
      if (offset != null) 'offset': offset,
      if (limit != null) 'limit': limit,
      if (sortProperty != null) 'sortProperty': sortProperty,
      if (sortAsc != null) 'sortAsc': sortAsc,
    };
  }
  static FilterOperation? _filterFromJson(Map<String, dynamic>? json) {
    if (json == null) {
      return null;
    }
    if (json.containsKey('filters')) {
      final filters = (json['filters'] as List)
          .map((e) => _filterFromJson(e as Map<String, dynamic>?)!)
          .toList();
      return FilterGroup(
        type: FilterGroupType.values[json['type'] as int],
        filters: filters,
      );
    } else {
      return FilterCondition(
        type: FilterConditionType.values[json['type'] as int],
        property: json['property'] as String,
        value1: json['value1'],
        value2: json['value2'],
        include1: json['include1'] as bool,
        include2: json['include2'] as bool,
        caseSensitive: json['caseSensitive'] as bool,
      );
    }
  }

  static Map<String, dynamic> _filterToJson(FilterOperation filter) {
    if (filter is FilterCondition) {
      return {
        'type': filter.type.index,
        'property': filter.property,
        'value1': filter.value1,
        'value2': filter.value2,
        'include1': filter.include1,
        'include2': filter.include2,
        'caseSensitive': filter.caseSensitive,
      };
    } else if (filter is FilterGroup) {
      return {
        'type': filter.type.index,
        'filters': filter.filters.map(_filterToJson).toList(),
      };
    } else {
      throw UnimplementedError();
    }
  }
  Query<dynamic> toQuery() {
    final isar = Isar.getInstance(instance)!;
    // ignore: invalid_use_of_protected_member
    final collection = isar.getCollectionByNameInternal(this.collection)!;
    WhereClause? whereClause;
    var whereSort = Sort.asc;

    SortProperty? sortProperty;
    if (this.sortProperty != null) {
      if (this.sortProperty == collection.schema.idName) {
        whereClause = const IdWhereClause.any();
        whereSort = sortAsc == true ? Sort.asc : Sort.desc;
      } else {
        sortProperty = SortProperty(
          property: this.sortProperty!,
          sort: sortAsc == true ? Sort.asc : Sort.desc,
        );
      }
    }
    return collection.buildQuery(
      whereClauses: [if (whereClause != null) whereClause],
      whereSort: whereSort,
      filter: filter,
      offset: offset,
      limit: limit,
      sortBy: [if (sortProperty != null) sortProperty],
    );
  }
}
class ConnectEdit {
  ConnectEdit({
    required this.instance,
    required this.collection,
    required this.id,
    required this.path,
    required this.value,
  });

  factory ConnectEdit.fromJson(Map<String, dynamic> json) {
    return ConnectEdit(
      instance: json['instance'] as String,
      collection: json['collection'] as String,
      id: json['id'] as Id,
      path: json['path'] as String,
      value: json['value'],
    );
  }

  final String instance;
  final String collection;
  final Id id;
  final String path;
  final dynamic value;

  Map<String, dynamic> toJson() {
    return {
      'instance': instance,
      'collection': collection,
      'id': id,
      'path': path,
      'value': value,
    };
  }
}
class ConnectCollectionInfo {
  ConnectCollectionInfo({
    required this.instance,
    required this.collection,
    required this.size,
    required this.count,
  });

  factory ConnectCollectionInfo.fromJson(Map<String, dynamic> json) {
    return ConnectCollectionInfo(
      instance: json['instance'] as String,
      collection: json['collection'] as String,
      size: json['size'] as int,
      count: json['count'] as int,
    );
  }

  final String instance;
  final String collection;
  final int size;
  final int count;

  Map<String, dynamic> toJson() {
    return {
      'instance': instance,
      'collection': collection,
      'size': size,
      'count': count,
    };
  }
}
23
lib/src/isar_error.dart
Normal file
@ -0,0 +1,23 @@
part of isar;

/// An error raised by Isar.
class IsarError extends Error {
  /// @nodoc
  @protected
  IsarError(this.message);

  /// The error message.
  final String message;

  @override
  String toString() {
    return 'IsarError: $message';
  }
}

/// This error is returned when a unique index constraint is violated.
class IsarUniqueViolationError extends IsarError {
  /// @nodoc
  @protected
  IsarUniqueViolationError() : super('Unique index violated');
}
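A sketch of how these errors surface to callers; the `Contact` collection and its unique index are hypothetical, assumed for illustration:

```dart
// Assumes a generated `Contact` collection with a unique index, e.g. on email.
Future<void> putUnique(Isar isar, Contact contact) async {
  try {
    await isar.writeTxn(() => isar.contacts.put(contact));
  } on IsarUniqueViolationError {
    // Another object already uses the same unique index key.
  } on IsarError catch (e) {
    print(e); // "IsarError: <message>"
  }
}
```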
113
lib/src/isar_link.dart
Normal file
@ -0,0 +1,113 @@
part of isar;

/// @nodoc
@sealed
abstract class IsarLinkBase<OBJ> {
  /// Is the containing object managed by Isar?
  bool get isAttached;

  /// Have the contents been changed? If not, `.save()` is a no-op.
  bool get isChanged;

  /// Has this link been loaded?
  bool get isLoaded;

  /// {@template link_load}
  /// Loads the linked object(s) from the database.
  /// {@endtemplate}
  Future<void> load();

  /// {@macro link_load}
  void loadSync();

  /// {@template link_save}
  /// Saves the linked object(s) to the database if there are changes.
  ///
  /// Also puts new objects into the database that have id set to `null` or
  /// `Isar.autoIncrement`.
  /// {@endtemplate}
  Future<void> save();

  /// {@macro link_save}
  void saveSync();

  /// {@template link_reset}
  /// Unlinks all linked object(s).
  ///
  /// You can even call this method on links that have not been loaded yet.
  /// {@endtemplate}
  Future<void> reset();

  /// {@macro link_reset}
  void resetSync();

  /// @nodoc
  @protected
  void attach(
    IsarCollection<dynamic> sourceCollection,
    IsarCollection<OBJ> targetCollection,
    String linkName,
    Id? objectId,
  );
}
/// Establishes a 1:1 relationship with the same or another collection. The
/// target collection is specified by the generic type argument.
abstract class IsarLink<OBJ> implements IsarLinkBase<OBJ> {
  /// Create an empty, unattached link. Make sure to provide the correct
  /// generic argument.
  factory IsarLink() => IsarLinkImpl();

  /// The linked object or `null` if no object is linked.
  OBJ? get value;

  /// The linked object or `null` if no object is linked.
  set value(OBJ? obj);
}
/// Establishes a 1:n relationship with the same or another collection. The
|
||||||
|
/// target collection is specified by the generic type argument.
|
||||||
|
abstract class IsarLinks<OBJ> implements IsarLinkBase<OBJ>, Set<OBJ> {
|
||||||
|
/// Create an empty, unattached link. Make sure to provide the correct
|
||||||
|
/// generic argument.
|
||||||
|
factory IsarLinks() => IsarLinksImpl();
|
||||||
|
|
||||||
|
@override
|
||||||
|
Future<void> load({bool overrideChanges = true});
|
||||||
|
|
||||||
|
@override
|
||||||
|
void loadSync({bool overrideChanges = true});
|
||||||
|
|
||||||
|
/// {@template links_update}
|
||||||
|
/// Creates and removes the specified links in the database.
|
||||||
|
///
|
||||||
|
/// This operation does not alter the state of the local copy of this link
|
||||||
|
/// and it can even be used without loading the link.
|
||||||
|
/// {@endtemplate}
|
||||||
|
Future<void> update({
|
||||||
|
Iterable<OBJ> link = const [],
|
||||||
|
Iterable<OBJ> unlink = const [],
|
||||||
|
bool reset = false,
|
||||||
|
});
|
||||||
|
|
||||||
|
/// {@macro links_update}
|
||||||
|
void updateSync({
|
||||||
|
Iterable<OBJ> link = const [],
|
||||||
|
Iterable<OBJ> unlink = const [],
|
||||||
|
bool reset = false,
|
||||||
|
});
|
||||||
|
|
||||||
|
/// Starts a query for linked objects.
|
||||||
|
QueryBuilder<OBJ, OBJ, QAfterFilterCondition> filter();
|
||||||
|
|
||||||
|
/// {@template links_count}
|
||||||
|
/// Counts the linked objects in the database.
|
||||||
|
///
|
||||||
|
/// It does not take the local state into account and can even be used
|
||||||
|
/// without loading the link.
|
||||||
|
/// {@endtemplate}
|
||||||
|
Future<int> count() => filter().count();
|
||||||
|
|
||||||
|
/// {@macro links_count}
|
||||||
|
int countSync() => filter().countSync();
|
||||||
|
}
|
88
lib/src/isar_reader.dart
Normal file
@ -0,0 +1,88 @@
// ignore_for_file: public_member_api_docs

part of isar;

/// @nodoc
@protected
abstract class IsarReader {
  bool readBool(int offset);

  bool? readBoolOrNull(int offset);

  int readByte(int offset);

  int? readByteOrNull(int offset);

  int readInt(int offset);

  int? readIntOrNull(int offset);

  double readFloat(int offset);

  double? readFloatOrNull(int offset);

  int readLong(int offset);

  int? readLongOrNull(int offset);

  double readDouble(int offset);

  double? readDoubleOrNull(int offset);

  DateTime readDateTime(int offset);

  DateTime? readDateTimeOrNull(int offset);

  String readString(int offset);

  String? readStringOrNull(int offset);

  T? readObjectOrNull<T>(
    int offset,
    Deserialize<T> deserialize,
    Map<Type, List<int>> allOffsets,
  );

  List<bool>? readBoolList(int offset);

  List<bool?>? readBoolOrNullList(int offset);

  List<int>? readByteList(int offset);

  List<int>? readIntList(int offset);

  List<int?>? readIntOrNullList(int offset);

  List<double>? readFloatList(int offset);

  List<double?>? readFloatOrNullList(int offset);

  List<int>? readLongList(int offset);

  List<int?>? readLongOrNullList(int offset);

  List<double>? readDoubleList(int offset);

  List<double?>? readDoubleOrNullList(int offset);

  List<DateTime>? readDateTimeList(int offset);

  List<DateTime?>? readDateTimeOrNullList(int offset);

  List<String>? readStringList(int offset);

  List<String?>? readStringOrNullList(int offset);

  List<T>? readObjectList<T>(
    int offset,
    Deserialize<T> deserialize,
    Map<Type, List<int>> allOffsets,
    T defaultValue,
  );

  List<T?>? readObjectOrNullList<T>(
    int offset,
    Deserialize<T> deserialize,
    Map<Type, List<int>> allOffsets,
  );
}
53
lib/src/isar_writer.dart
Normal file
@ -0,0 +1,53 @@
// ignore_for_file: public_member_api_docs

part of isar;

/// @nodoc
@protected
abstract class IsarWriter {
  void writeBool(int offset, bool? value);

  void writeByte(int offset, int value);

  void writeInt(int offset, int? value);

  void writeFloat(int offset, double? value);

  void writeLong(int offset, int? value);

  void writeDouble(int offset, double? value);

  void writeDateTime(int offset, DateTime? value);

  void writeString(int offset, String? value);

  void writeObject<T>(
    int offset,
    Map<Type, List<int>> allOffsets,
    Serialize<T> serialize,
    T? value,
  );

  void writeByteList(int offset, List<int>? values);

  void writeBoolList(int offset, List<bool?>? values);

  void writeIntList(int offset, List<int?>? values);

  void writeFloatList(int offset, List<double?>? values);

  void writeLongList(int offset, List<int?>? values);

  void writeDoubleList(int offset, List<double?>? values);

  void writeDateTimeList(int offset, List<DateTime?>? values);

  void writeStringList(int offset, List<String?>? values);

  void writeObjectList<T>(
    int offset,
    Map<Type, List<int>> allOffsets,
    Serialize<T> serialize,
    List<T?>? values,
  );
}
|
2241
lib/src/native/bindings.dart
Normal file
File diff suppressed because it is too large
54
lib/src/native/encode_string.dart
Normal file
@ -0,0 +1,54 @@
import 'dart:ffi';
import 'dart:typed_data';

const int _oneByteLimit = 0x7f; // 7 bits
const int _twoByteLimit = 0x7ff; // 11 bits
const int _surrogateTagMask = 0xFC00;
const int _surrogateValueMask = 0x3FF;
const int _leadSurrogateMin = 0xD800;

/// Encodes a Dart String to UTF-8, writes it at [offset] into [buffer] and
/// returns the number of written bytes.
///
/// The buffer needs to have a capacity of at least `offset + str.length * 3`.
int encodeString(String str, Uint8List buffer, int offset) {
  final startOffset = offset;
  for (var stringIndex = 0; stringIndex < str.length; stringIndex++) {
    final codeUnit = str.codeUnitAt(stringIndex);
    // ASCII has the same representation in UTF-8 and UTF-16.
    if (codeUnit <= _oneByteLimit) {
      buffer[offset++] = codeUnit;
    } else if ((codeUnit & _surrogateTagMask) == _leadSurrogateMin) {
      // Combine the surrogate pair into a single rune.
      final nextCodeUnit = str.codeUnitAt(++stringIndex);
      final rune = 0x10000 + ((codeUnit & _surrogateValueMask) << 10) |
          (nextCodeUnit & _surrogateValueMask);
      // If the rune is encoded with 2 code units, it must be encoded
      // with 4 bytes in UTF-8.
      buffer[offset++] = 0xF0 | (rune >> 18);
      buffer[offset++] = 0x80 | ((rune >> 12) & 0x3f);
      buffer[offset++] = 0x80 | ((rune >> 6) & 0x3f);
      buffer[offset++] = 0x80 | (rune & 0x3f);
    } else if (codeUnit <= _twoByteLimit) {
      buffer[offset++] = 0xC0 | (codeUnit >> 6);
      buffer[offset++] = 0x80 | (codeUnit & 0x3f);
    } else {
      buffer[offset++] = 0xE0 | (codeUnit >> 12);
      buffer[offset++] = 0x80 | ((codeUnit >> 6) & 0x3f);
      buffer[offset++] = 0x80 | (codeUnit & 0x3f);
    }
  }
  return offset - startOffset;
}

/// @nodoc
extension CString on String {
  /// Create a zero-terminated C string from a Dart String.
  Pointer<Char> toCString(Allocator alloc) {
    final bufferPtr = alloc<Uint8>(length * 3 + 1);
    final buffer = bufferPtr.asTypedList(length * 3 + 1);
    final size = encodeString(this, buffer, 0);
    buffer[size] = 0;
    return bufferPtr.cast();
  }
}
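The encoder above writes one byte for ASCII, two bytes up to U+07FF, three bytes for the rest of the Basic Multilingual Plane, and four bytes for characters that UTF-16 represents as surrogate pairs; this is why `toCString` can safely allocate `length * 3 + 1` bytes (a surrogate pair is two code units but only four bytes). A minimal Python sketch (not part of the package, names are mine) cross-checks those byte widths against Python's reference UTF-8 encoder:

```python
# Byte widths used by encodeString: 1 byte for ASCII, 2 bytes up to U+07FF,
# 3 bytes for the rest of the BMP, 4 bytes beyond the BMP (a surrogate
# pair in UTF-16).
def utf8_width(code_point: int) -> int:
    if code_point <= 0x7F:    # _oneByteLimit
        return 1
    if code_point <= 0x7FF:   # _twoByteLimit
        return 2
    if code_point <= 0xFFFF:  # rest of the Basic Multilingual Plane
        return 3
    return 4                  # encoded as a surrogate pair in UTF-16

# Cross-check against a reference UTF-8 encoder.
for text in ["abc", "é", "€", "😀"]:
    expected = len(text.encode("utf-8"))
    computed = sum(utf8_width(ord(ch)) for ch in text)
    assert computed == expected, (text, computed, expected)
```

Note that a BMP code point costs at most 3 UTF-8 bytes per UTF-16 code unit, and a 4-byte character consumes two code units, so the `length * 3` bound always holds.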
257
lib/src/native/index_key.dart
Normal file
@ -0,0 +1,257 @@
// ignore_for_file: public_member_api_docs

import 'dart:ffi';

import 'package:ffi/ffi.dart';
import 'package:isar/isar.dart';
import 'package:isar/src/native/bindings.dart';
import 'package:isar/src/native/encode_string.dart';
import 'package:isar/src/native/isar_core.dart';
import 'package:isar/src/native/isar_writer_impl.dart';

final _keyPtrPtr = malloc<Pointer<CIndexKey>>();

Pointer<CIndexKey> buildIndexKey(
  CollectionSchema<dynamic> schema,
  IndexSchema index,
  IndexKey key,
) {
  if (key.length > index.properties.length) {
    throw IsarError('Invalid number of values for index ${index.name}.');
  }

  IC.isar_key_create(_keyPtrPtr);
  final keyPtr = _keyPtrPtr.value;

  for (var i = 0; i < key.length; i++) {
    final indexProperty = index.properties[i];
    _addKeyValue(
      keyPtr,
      key[i],
      schema.property(indexProperty.name),
      indexProperty.type,
      indexProperty.caseSensitive,
    );
  }

  return keyPtr;
}

Pointer<CIndexKey> buildLowerUnboundedIndexKey() {
  IC.isar_key_create(_keyPtrPtr);
  return _keyPtrPtr.value;
}

Pointer<CIndexKey> buildUpperUnboundedIndexKey() {
  IC.isar_key_create(_keyPtrPtr);
  final keyPtr = _keyPtrPtr.value;
  IC.isar_key_add_long(keyPtr, maxLong);

  return keyPtr;
}

void _addKeyValue(
  Pointer<CIndexKey> keyPtr,
  Object? value,
  PropertySchema property,
  IndexType type,
  bool caseSensitive,
) {
  // Map enum values to their persisted representation first.
  if (property.enumMap != null) {
    if (value is Enum) {
      value = property.enumMap![value.name];
    } else if (value is List) {
      value = value.map((e) {
        if (e is Enum) {
          return property.enumMap![e.name];
        } else {
          return e;
        }
      }).toList();
    }
  }

  final isarType =
      type != IndexType.hash ? property.type.scalarType : property.type;
  switch (isarType) {
    case IsarType.bool:
      IC.isar_key_add_byte(keyPtr, (value as bool?).byteValue);
      break;
    case IsarType.byte:
      IC.isar_key_add_byte(keyPtr, (value ?? 0) as int);
      break;
    case IsarType.int:
      IC.isar_key_add_int(keyPtr, (value as int?) ?? nullInt);
      break;
    case IsarType.float:
      IC.isar_key_add_float(keyPtr, (value as double?) ?? nullFloat);
      break;
    case IsarType.long:
      IC.isar_key_add_long(keyPtr, (value as int?) ?? nullLong);
      break;
    case IsarType.double:
      IC.isar_key_add_double(keyPtr, (value as double?) ?? nullDouble);
      break;
    case IsarType.dateTime:
      IC.isar_key_add_long(keyPtr, (value as DateTime?).longValue);
      break;
    case IsarType.string:
      final strPtr = _strToNative(value as String?);
      if (type == IndexType.value) {
        IC.isar_key_add_string(keyPtr, strPtr, caseSensitive);
      } else {
        IC.isar_key_add_string_hash(keyPtr, strPtr, caseSensitive);
      }
      _freeStr(strPtr);
      break;
    case IsarType.boolList:
      if (value == null) {
        IC.isar_key_add_byte_list_hash(keyPtr, nullptr, 0);
      } else {
        value as List<bool?>;
        final boolListPtr = malloc<Uint8>(value.length);
        boolListPtr
            .asTypedList(value.length)
            .setAll(0, value.map((e) => e.byteValue));
        IC.isar_key_add_byte_list_hash(keyPtr, boolListPtr, value.length);
        malloc.free(boolListPtr);
      }
      break;
    case IsarType.byteList:
      if (value == null) {
        IC.isar_key_add_byte_list_hash(keyPtr, nullptr, 0);
      } else {
        value as List<int>;
        final bytesPtr = malloc<Uint8>(value.length);
        bytesPtr.asTypedList(value.length).setAll(0, value);
        IC.isar_key_add_byte_list_hash(keyPtr, bytesPtr, value.length);
        malloc.free(bytesPtr);
      }
      break;
    case IsarType.intList:
      if (value == null) {
        IC.isar_key_add_int_list_hash(keyPtr, nullptr, 0);
      } else {
        value as List<int?>;
        final intListPtr = malloc<Int32>(value.length);
        intListPtr
            .asTypedList(value.length)
            .setAll(0, value.map((e) => e ?? nullInt));
        IC.isar_key_add_int_list_hash(keyPtr, intListPtr, value.length);
        malloc.free(intListPtr);
      }
      break;
    case IsarType.longList:
      if (value == null) {
        IC.isar_key_add_long_list_hash(keyPtr, nullptr, 0);
      } else {
        value as List<int?>;
        final longListPtr = malloc<Int64>(value.length);
        longListPtr
            .asTypedList(value.length)
            .setAll(0, value.map((e) => e ?? nullLong));
        IC.isar_key_add_long_list_hash(keyPtr, longListPtr, value.length);
        malloc.free(longListPtr);
      }
      break;
    case IsarType.dateTimeList:
      if (value == null) {
        IC.isar_key_add_long_list_hash(keyPtr, nullptr, 0);
      } else {
        value as List<DateTime?>;
        final longListPtr = malloc<Int64>(value.length);
        for (var i = 0; i < value.length; i++) {
          longListPtr[i] = value[i].longValue;
        }
        IC.isar_key_add_long_list_hash(keyPtr, longListPtr, value.length);
        malloc.free(longListPtr);
      }
      break;
    case IsarType.stringList:
      if (value == null) {
        IC.isar_key_add_string_list_hash(keyPtr, nullptr, 0, false);
      } else {
        value as List<String?>;
        final stringListPtr = malloc<Pointer<Char>>(value.length);
        for (var i = 0; i < value.length; i++) {
          stringListPtr[i] = _strToNative(value[i]);
        }
        IC.isar_key_add_string_list_hash(
          keyPtr,
          stringListPtr,
          value.length,
          caseSensitive,
        );
        for (var i = 0; i < value.length; i++) {
          _freeStr(stringListPtr[i]);
        }
      }
      break;
    case IsarType.object:
    case IsarType.floatList:
    case IsarType.doubleList:
    case IsarType.objectList:
      throw IsarError('Unsupported property type.');
  }
}

Pointer<Char> _strToNative(String? str) {
  if (str == null) {
    return Pointer.fromAddress(0);
  } else {
    return str.toCString(malloc);
  }
}

void _freeStr(Pointer<Char> strPtr) {
  if (!strPtr.isNull) {
    malloc.free(strPtr);
  }
}

double? adjustFloatBound({
  required double? value,
  required bool lowerBound,
  required bool include,
  required double epsilon,
}) {
  value ??= double.nan;

  if (lowerBound) {
    if (include) {
      if (value.isFinite) {
        return value - epsilon;
      }
    } else {
      if (value.isNaN) {
        return double.negativeInfinity;
      } else if (value == double.negativeInfinity) {
        return -double.maxFinite;
      } else if (value == double.maxFinite) {
        return double.infinity;
      } else if (value == double.infinity) {
        return null;
      } else {
        return value + epsilon;
      }
    }
  } else {
    if (include) {
      if (value.isFinite) {
        return value + epsilon;
      }
    } else {
      if (value.isNaN) {
        return null;
      } else if (value == double.negativeInfinity) {
        return double.nan;
      } else if (value == -double.maxFinite) {
        return double.negativeInfinity;
      } else if (value == double.infinity) {
        return double.maxFinite;
      } else {
        return value - epsilon;
      }
    }
  }
  return value;
}
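`adjustFloatBound` widens or narrows a float query bound so the native engine can evaluate it, using Isar's total order NaN < -infinity < finite values < +infinity: an included bound is padded by `epsilon`, while an excluded bound is moved to the next representable step in that order (returning `null` when the resulting range is empty). A Python sketch of the lower-bound half (function and constant names are mine, not part of the package):

```python
import math

# Dart's double.maxFinite.
MAX_FINITE = 1.7976931348623157e308

def adjust_lower_bound(value, include, epsilon):
    """Mirror of the lowerBound branch of adjustFloatBound.

    Isar orders floats as NaN < -inf < finite < +inf. An included bound is
    widened by epsilon; an excluded bound is moved to the next value in
    that order. None means the range is empty.
    """
    if value is None:
        value = math.nan
    if include:
        if math.isfinite(value):
            return value - epsilon
        return value  # NaN / infinities stay as-is, like the Dart fallthrough
    if math.isnan(value):
        return -math.inf          # everything above NaN
    if value == -math.inf:
        return -MAX_FINITE        # smallest finite value
    if value == MAX_FINITE:
        return math.inf           # only +inf lies above the largest finite
    if value == math.inf:
        return None               # nothing lies above +inf: empty range
    return value + epsilon
```

The upper-bound branch is symmetric: excluding +infinity steps down to `MAX_FINITE`, and excluding NaN yields an empty range since NaN is the minimum of the order.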
649
lib/src/native/isar_collection_impl.dart
Normal file
@ -0,0 +1,649 @@
|
|||||||
|
// ignore_for_file: public_member_api_docs, invalid_use_of_protected_member
|
||||||
|
|
||||||
|
import 'dart:async';
|
||||||
|
import 'dart:convert';
|
||||||
|
import 'dart:ffi';
|
||||||
|
import 'dart:isolate';
|
||||||
|
import 'dart:typed_data';
|
||||||
|
|
||||||
|
import 'package:ffi/ffi.dart';
|
||||||
|
import 'package:isar/isar.dart';
|
||||||
|
import 'package:isar/src/native/bindings.dart';
|
||||||
|
import 'package:isar/src/native/encode_string.dart';
|
||||||
|
import 'package:isar/src/native/index_key.dart';
|
||||||
|
import 'package:isar/src/native/isar_core.dart';
|
||||||
|
import 'package:isar/src/native/isar_impl.dart';
|
||||||
|
import 'package:isar/src/native/isar_reader_impl.dart';
|
||||||
|
import 'package:isar/src/native/isar_writer_impl.dart';
|
||||||
|
import 'package:isar/src/native/query_build.dart';
|
||||||
|
import 'package:isar/src/native/txn.dart';
|
||||||
|
|
||||||
|
class IsarCollectionImpl<OBJ> extends IsarCollection<OBJ> {
|
||||||
|
IsarCollectionImpl({
|
||||||
|
required this.isar,
|
||||||
|
required this.ptr,
|
||||||
|
required this.schema,
|
||||||
|
});
|
||||||
|
|
||||||
|
@override
|
||||||
|
final IsarImpl isar;
|
||||||
|
final Pointer<CIsarCollection> ptr;
|
||||||
|
|
||||||
|
@override
|
||||||
|
final CollectionSchema<OBJ> schema;
|
||||||
|
|
||||||
|
late final _offsets = isar.offsets[OBJ]!;
|
||||||
|
late final _staticSize = _offsets.last;
|
||||||
|
|
||||||
|
@pragma('vm:prefer-inline')
|
||||||
|
OBJ deserializeObject(CObject cObj) {
|
||||||
|
final buffer = cObj.buffer.asTypedList(cObj.buffer_length);
|
||||||
|
final reader = IsarReaderImpl(buffer);
|
||||||
|
final object = schema.deserialize(
|
||||||
|
cObj.id,
|
||||||
|
reader,
|
||||||
|
_offsets,
|
||||||
|
isar.offsets,
|
||||||
|
);
|
||||||
|
schema.attach(this, cObj.id, object);
|
||||||
|
return object;
|
||||||
|
}
|
||||||
|
|
||||||
|
@pragma('vm:prefer-inline')
|
||||||
|
OBJ? deserializeObjectOrNull(CObject cObj) {
|
||||||
|
if (!cObj.buffer.isNull) {
|
||||||
|
return deserializeObject(cObj);
|
||||||
|
} else {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
@pragma('vm:prefer-inline')
|
||||||
|
List<OBJ> deserializeObjects(CObjectSet objectSet) {
|
||||||
|
final objects = <OBJ>[];
|
||||||
|
for (var i = 0; i < objectSet.length; i++) {
|
||||||
|
final cObjPtr = objectSet.objects.elementAt(i);
|
||||||
|
final object = deserializeObject(cObjPtr.ref);
|
||||||
|
objects.add(object);
|
||||||
|
}
|
||||||
|
return objects;
|
||||||
|
}
|
||||||
|
|
||||||
|
@pragma('vm:prefer-inline')
|
||||||
|
List<OBJ?> deserializeObjectsOrNull(CObjectSet objectSet) {
|
||||||
|
final objects = List<OBJ?>.filled(objectSet.length, null);
|
||||||
|
for (var i = 0; i < objectSet.length; i++) {
|
||||||
|
final cObj = objectSet.objects.elementAt(i).ref;
|
||||||
|
if (!cObj.buffer.isNull) {
|
||||||
|
objects[i] = deserializeObject(cObj);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return objects;
|
||||||
|
}
|
||||||
|
|
||||||
|
@pragma('vm:prefer-inline')
|
||||||
|
Pointer<Pointer<CIndexKey>> _getKeysPtr(
|
||||||
|
String indexName,
|
||||||
|
List<IndexKey> keys,
|
||||||
|
Allocator alloc,
|
||||||
|
) {
|
||||||
|
final keysPtrPtr = alloc<Pointer<CIndexKey>>(keys.length);
|
||||||
|
for (var i = 0; i < keys.length; i++) {
|
||||||
|
keysPtrPtr[i] = buildIndexKey(schema, schema.index(indexName), keys[i]);
|
||||||
|
}
|
||||||
|
return keysPtrPtr;
|
||||||
|
}
|
||||||
|
|
||||||
|
List<T> deserializeProperty<T>(CObjectSet objectSet, int? propertyId) {
|
||||||
|
final values = <T>[];
|
||||||
|
if (propertyId != null) {
|
||||||
|
final propertyOffset = _offsets[propertyId];
|
||||||
|
for (var i = 0; i < objectSet.length; i++) {
|
||||||
|
final cObj = objectSet.objects.elementAt(i).ref;
|
||||||
|
final buffer = cObj.buffer.asTypedList(cObj.buffer_length);
|
||||||
|
values.add(
|
||||||
|
schema.deserializeProp(
|
||||||
|
IsarReaderImpl(buffer),
|
||||||
|
propertyId,
|
||||||
|
propertyOffset,
|
||||||
|
isar.offsets,
|
||||||
|
) as T,
|
||||||
|
);
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
for (var i = 0; i < objectSet.length; i++) {
|
||||||
|
final cObj = objectSet.objects.elementAt(i).ref;
|
||||||
|
values.add(cObj.id as T);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return values;
|
||||||
|
}
|
||||||
|
|
||||||
|
void serializeObjects(
|
||||||
|
Txn txn,
|
||||||
|
Pointer<CObject> objectsPtr,
|
||||||
|
List<OBJ> objects,
|
||||||
|
) {
|
||||||
|
var maxBufferSize = 0;
|
||||||
|
for (var i = 0; i < objects.length; i++) {
|
||||||
|
final object = objects[i];
|
||||||
|
maxBufferSize += schema.estimateSize(object, _offsets, isar.offsets);
|
||||||
|
}
|
||||||
|
final bufferPtr = txn.alloc<Uint8>(maxBufferSize);
|
||||||
|
final buffer = bufferPtr.asTypedList(maxBufferSize).buffer;
|
||||||
|
|
||||||
|
var writtenBytes = 0;
|
||||||
|
for (var i = 0; i < objects.length; i++) {
|
||||||
|
final objBuffer = buffer.asUint8List(writtenBytes);
|
||||||
|
final binaryWriter = IsarWriterImpl(objBuffer, _staticSize);
|
||||||
|
|
||||||
|
final object = objects[i];
|
||||||
|
schema.serialize(
|
||||||
|
object,
|
||||||
|
binaryWriter,
|
||||||
|
_offsets,
|
||||||
|
isar.offsets,
|
||||||
|
);
|
||||||
|
final size = binaryWriter.usedBytes;
|
||||||
|
|
||||||
|
final cObj = objectsPtr.elementAt(i).ref;
|
||||||
|
cObj.id = schema.getId(object);
|
||||||
|
cObj.buffer = bufferPtr.elementAt(writtenBytes);
|
||||||
|
cObj.buffer_length = size;
|
||||||
|
|
||||||
|
writtenBytes += size;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
Future<List<OBJ?>> getAll(List<int> ids) {
|
||||||
|
return isar.getTxn(false, (Txn txn) async {
|
||||||
|
final cObjSetPtr = txn.newCObjectSet(ids.length);
|
||||||
|
final objectsPtr = cObjSetPtr.ref.objects;
|
||||||
|
for (var i = 0; i < ids.length; i++) {
|
||||||
|
objectsPtr.elementAt(i).ref.id = ids[i];
|
||||||
|
}
|
||||||
|
IC.isar_get_all(ptr, txn.ptr, cObjSetPtr);
|
||||||
|
await txn.wait();
|
||||||
|
return deserializeObjectsOrNull(cObjSetPtr.ref);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<OBJ?> getAllSync(List<int> ids) {
|
||||||
|
return isar.getTxnSync(false, (Txn txn) {
|
||||||
|
final cObjPtr = txn.getCObject();
|
||||||
|
final cObj = cObjPtr.ref;
|
||||||
|
|
||||||
|
final objects = List<OBJ?>.filled(ids.length, null);
|
||||||
|
for (var i = 0; i < ids.length; i++) {
|
||||||
|
cObj.id = ids[i];
|
||||||
|
nCall(IC.isar_get(ptr, txn.ptr, cObjPtr));
|
||||||
|
objects[i] = deserializeObjectOrNull(cObj);
|
||||||
|
}
|
||||||
|
|
||||||
|
return objects;
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
Future<List<OBJ?>> getAllByIndex(String indexName, List<IndexKey> keys) {
|
||||||
|
return isar.getTxn(false, (Txn txn) async {
|
||||||
|
final cObjSetPtr = txn.newCObjectSet(keys.length);
|
||||||
|
final keysPtrPtr = _getKeysPtr(indexName, keys, txn.alloc);
|
||||||
|
IC.isar_get_all_by_index(
|
||||||
|
ptr,
|
||||||
|
txn.ptr,
|
||||||
|
schema.index(indexName).id,
|
||||||
|
keysPtrPtr,
|
||||||
|
cObjSetPtr,
|
||||||
|
);
|
||||||
|
await txn.wait();
|
||||||
|
return deserializeObjectsOrNull(cObjSetPtr.ref);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<OBJ?> getAllByIndexSync(String indexName, List<IndexKey> keys) {
|
||||||
|
final index = schema.index(indexName);
|
||||||
|
|
||||||
|
return isar.getTxnSync(false, (Txn txn) {
|
||||||
|
final cObjPtr = txn.getCObject();
|
||||||
|
final cObj = cObjPtr.ref;
|
||||||
|
|
||||||
|
final objects = List<OBJ?>.filled(keys.length, null);
|
||||||
|
for (var i = 0; i < keys.length; i++) {
|
||||||
|
final keyPtr = buildIndexKey(schema, index, keys[i]);
|
||||||
|
nCall(IC.isar_get_by_index(ptr, txn.ptr, index.id, keyPtr, cObjPtr));
|
||||||
|
objects[i] = deserializeObjectOrNull(cObj);
|
||||||
|
}
|
||||||
|
|
||||||
|
return objects;
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
int putSync(OBJ object, {bool saveLinks = true}) {
|
||||||
|
return isar.getTxnSync(true, (Txn txn) {
|
||||||
|
return putByIndexSyncInternal(
|
||||||
|
txn: txn,
|
||||||
|
object: object,
|
||||||
|
saveLinks: saveLinks,
|
||||||
|
);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
int putByIndexSync(String indexName, OBJ object, {bool saveLinks = true}) {
|
||||||
|
return isar.getTxnSync(true, (Txn txn) {
|
||||||
|
return putByIndexSyncInternal(
|
||||||
|
txn: txn,
|
||||||
|
object: object,
|
||||||
|
indexId: schema.index(indexName).id,
|
||||||
|
saveLinks: saveLinks,
|
||||||
|
);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
int putByIndexSyncInternal({
|
||||||
|
required Txn txn,
|
||||||
|
int? indexId,
|
||||||
|
required OBJ object,
|
||||||
|
bool saveLinks = true,
|
||||||
|
}) {
|
||||||
|
final cObjPtr = txn.getCObject();
|
||||||
|
final cObj = cObjPtr.ref;
|
||||||
|
|
||||||
|
final estimatedSize = schema.estimateSize(object, _offsets, isar.offsets);
|
||||||
|
cObj.buffer = txn.getBuffer(estimatedSize);
|
||||||
|
final buffer = cObj.buffer.asTypedList(estimatedSize);
|
||||||
|
|
||||||
|
final writer = IsarWriterImpl(buffer, _staticSize);
|
||||||
|
schema.serialize(
|
||||||
|
object,
|
||||||
|
writer,
|
||||||
|
_offsets,
|
||||||
|
isar.offsets,
|
||||||
|
);
|
||||||
|
cObj.buffer_length = writer.usedBytes;
|
||||||
|
|
||||||
|
cObj.id = schema.getId(object);
|
||||||
|
|
||||||
|
if (indexId != null) {
|
||||||
|
nCall(IC.isar_put_by_index(ptr, txn.ptr, indexId, cObjPtr));
|
||||||
|
} else {
|
||||||
|
nCall(IC.isar_put(ptr, txn.ptr, cObjPtr));
|
||||||
|
}
|
||||||
|
|
||||||
|
final id = cObj.id;
|
||||||
|
schema.attach(this, id, object);
|
||||||
|
|
||||||
|
if (saveLinks) {
|
||||||
|
for (final link in schema.getLinks(object)) {
|
||||||
|
link.saveSync();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return id;
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
Future<List<int>> putAll(List<OBJ> objects) {
|
||||||
|
return putAllByIndex(null, objects);
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<int> putAllSync(List<OBJ> objects, {bool saveLinks = true}) {
|
||||||
|
return putAllByIndexSync(null, objects, saveLinks: saveLinks);
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
Future<List<int>> putAllByIndex(String? indexName, List<OBJ> objects) {
|
||||||
|
final indexId = indexName != null ? schema.index(indexName).id : null;
|
||||||
|
|
||||||
|
return isar.getTxn(true, (Txn txn) async {
|
||||||
|
final cObjSetPtr = txn.newCObjectSet(objects.length);
|
||||||
|
serializeObjects(txn, cObjSetPtr.ref.objects, objects);
|
||||||
|
|
||||||
|
if (indexId != null) {
|
||||||
|
IC.isar_put_all_by_index(ptr, txn.ptr, indexId, cObjSetPtr);
|
||||||
|
} else {
|
||||||
|
IC.isar_put_all(ptr, txn.ptr, cObjSetPtr);
|
||||||
|
}
|
||||||
|
|
||||||
|
await txn.wait();
|
||||||
|
final cObjectSet = cObjSetPtr.ref;
|
||||||
|
final ids = List<int>.filled(objects.length, 0);
|
||||||
|
for (var i = 0; i < objects.length; i++) {
|
||||||
|
final cObjPtr = cObjectSet.objects.elementAt(i);
|
||||||
|
final id = cObjPtr.ref.id;
|
||||||
|
ids[i] = id;
|
||||||
|
|
||||||
|
final object = objects[i];
|
||||||
|
schema.attach(this, id, object);
|
||||||
|
}
|
||||||
|
return ids;
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<int> putAllByIndexSync(
|
||||||
|
String? indexName,
|
||||||
|
List<OBJ> objects, {
|
||||||
|
bool saveLinks = true,
|
||||||
|
}) {
|
||||||
|
final indexId = indexName != null ? schema.index(indexName).id : null;
|
||||||
|
final ids = List.filled(objects.length, 0);
|
||||||
|
isar.getTxnSync(true, (Txn txn) {
|
||||||
|
for (var i = 0; i < objects.length; i++) {
|
        ids[i] = putByIndexSyncInternal(
          txn: txn,
          object: objects[i],
          indexId: indexId,
          saveLinks: saveLinks,
        );
      }
    });
    return ids;
  }

  @override
  Future<int> deleteAll(List<int> ids) {
    return isar.getTxn(true, (Txn txn) async {
      final countPtr = txn.alloc<Uint32>();
      final idsPtr = txn.alloc<Int64>(ids.length);
      idsPtr.asTypedList(ids.length).setAll(0, ids);

      IC.isar_delete_all(ptr, txn.ptr, idsPtr, ids.length, countPtr);
      await txn.wait();

      return countPtr.value;
    });
  }

  @override
  int deleteAllSync(List<int> ids) {
    return isar.getTxnSync(true, (Txn txn) {
      final deletedPtr = txn.alloc<Bool>();

      var counter = 0;
      for (var i = 0; i < ids.length; i++) {
        nCall(IC.isar_delete(ptr, txn.ptr, ids[i], deletedPtr));
        if (deletedPtr.value) {
          counter++;
        }
      }
      return counter;
    });
  }

  @override
  Future<int> deleteAllByIndex(String indexName, List<IndexKey> keys) {
    return isar.getTxn(true, (Txn txn) async {
      final countPtr = txn.alloc<Uint32>();
      final keysPtrPtr = _getKeysPtr(indexName, keys, txn.alloc);

      IC.isar_delete_all_by_index(
        ptr,
        txn.ptr,
        schema.index(indexName).id,
        keysPtrPtr,
        keys.length,
        countPtr,
      );
      await txn.wait();

      return countPtr.value;
    });
  }

  @override
  int deleteAllByIndexSync(String indexName, List<IndexKey> keys) {
    return isar.getTxnSync(true, (Txn txn) {
      final countPtr = txn.alloc<Uint32>();
      final keysPtrPtr = _getKeysPtr(indexName, keys, txn.alloc);

      nCall(
        IC.isar_delete_all_by_index(
          ptr,
          txn.ptr,
          schema.index(indexName).id,
          keysPtrPtr,
          keys.length,
          countPtr,
        ),
      );
      return countPtr.value;
    });
  }

  @override
  Future<void> clear() {
    return isar.getTxn(true, (Txn txn) async {
      IC.isar_clear(ptr, txn.ptr);
      await txn.wait();
    });
  }

  @override
  void clearSync() {
    isar.getTxnSync(true, (Txn txn) {
      nCall(IC.isar_clear(ptr, txn.ptr));
    });
  }

  @override
  Future<void> importJson(List<Map<String, dynamic>> json) {
    final bytes = const Utf8Encoder().convert(jsonEncode(json));
    return importJsonRaw(bytes);
  }

  @override
  Future<void> importJsonRaw(Uint8List jsonBytes) {
    return isar.getTxn(true, (Txn txn) async {
      final bytesPtr = txn.alloc<Uint8>(jsonBytes.length);
      bytesPtr.asTypedList(jsonBytes.length).setAll(0, jsonBytes);
      final idNamePtr = schema.idName.toCString(txn.alloc);

      IC.isar_json_import(
        ptr,
        txn.ptr,
        idNamePtr,
        bytesPtr,
        jsonBytes.length,
      );
      await txn.wait();
    });
  }

  @override
  void importJsonSync(List<Map<String, dynamic>> json) {
    final bytes = const Utf8Encoder().convert(jsonEncode(json));
    importJsonRawSync(bytes);
  }

  @override
  void importJsonRawSync(Uint8List jsonBytes) {
    // The callback must be synchronous here: getTxnSync runs inside a
    // synchronous transaction, and the method returns void.
    isar.getTxnSync(true, (Txn txn) {
      final bytesPtr = txn.getBuffer(jsonBytes.length);
      bytesPtr.asTypedList(jsonBytes.length).setAll(0, jsonBytes);
      final idNamePtr = schema.idName.toCString(txn.alloc);

      nCall(
        IC.isar_json_import(
          ptr,
          txn.ptr,
          idNamePtr,
          bytesPtr,
          jsonBytes.length,
        ),
      );
    });
  }

  @override
  Future<int> count() {
    return isar.getTxn(false, (Txn txn) async {
      final countPtr = txn.alloc<Int64>();
      IC.isar_count(ptr, txn.ptr, countPtr);
      await txn.wait();
      return countPtr.value;
    });
  }

  @override
  int countSync() {
    return isar.getTxnSync(false, (Txn txn) {
      final countPtr = txn.alloc<Int64>();
      nCall(IC.isar_count(ptr, txn.ptr, countPtr));
      return countPtr.value;
    });
  }

  @override
  Future<int> getSize({
    bool includeIndexes = false,
    bool includeLinks = false,
  }) {
    return isar.getTxn(false, (Txn txn) async {
      final sizePtr = txn.alloc<Int64>();
      IC.isar_get_size(ptr, txn.ptr, includeIndexes, includeLinks, sizePtr);
      await txn.wait();
      return sizePtr.value;
    });
  }

  @override
  int getSizeSync({bool includeIndexes = false, bool includeLinks = false}) {
    return isar.getTxnSync(false, (Txn txn) {
      final sizePtr = txn.alloc<Int64>();
      nCall(
        IC.isar_get_size(
          ptr,
          txn.ptr,
          includeIndexes,
          includeLinks,
          sizePtr,
        ),
      );
      return sizePtr.value;
    });
  }

  @override
  Stream<void> watchLazy({bool fireImmediately = false}) {
    isar.requireOpen();
    final port = ReceivePort();
    final handle =
        IC.isar_watch_collection(isar.ptr, ptr, port.sendPort.nativePort);
    final controller = StreamController<void>(
      onCancel: () {
        IC.isar_stop_watching(handle);
        port.close();
      },
    );

    if (fireImmediately) {
      controller.add(null);
    }

    controller.addStream(port);
    return controller.stream;
  }

  @override
  Stream<OBJ?> watchObject(Id id, {bool fireImmediately = false}) {
    return watchObjectLazy(id, fireImmediately: fireImmediately)
        .asyncMap((event) => get(id));
  }

  @override
  Stream<void> watchObjectLazy(Id id, {bool fireImmediately = false}) {
    isar.requireOpen();
    final cObjPtr = malloc<CObject>();

    final port = ReceivePort();
    final handle =
        IC.isar_watch_object(isar.ptr, ptr, id, port.sendPort.nativePort);
    malloc.free(cObjPtr);

    final controller = StreamController<void>(
      onCancel: () {
        IC.isar_stop_watching(handle);
        port.close();
      },
    );

    if (fireImmediately) {
      controller.add(null);
    }

    controller.addStream(port);
    return controller.stream;
  }

  @override
  Query<T> buildQuery<T>({
    List<WhereClause> whereClauses = const [],
    bool whereDistinct = false,
    Sort whereSort = Sort.asc,
    FilterOperation? filter,
    List<SortProperty> sortBy = const [],
    List<DistinctProperty> distinctBy = const [],
    int? offset,
    int? limit,
    String? property,
  }) {
    isar.requireOpen();
    return buildNativeQuery(
      this,
      whereClauses,
      whereDistinct,
      whereSort,
      filter,
      sortBy,
      distinctBy,
      offset,
      limit,
      property,
    );
  }

  @override
  Future<void> verify(List<OBJ> objects) async {
    await isar.verify();
    return isar.getTxn(false, (Txn txn) async {
      final cObjSetPtr = txn.newCObjectSet(objects.length);
      serializeObjects(txn, cObjSetPtr.ref.objects, objects);

      IC.isar_verify(ptr, txn.ptr, cObjSetPtr);
      await txn.wait();
    });
  }

  @override
  Future<void> verifyLink(
    String linkName,
    List<int> sourceIds,
    List<int> targetIds,
  ) async {
    final link = schema.link(linkName);

    return isar.getTxn(false, (Txn txn) async {
      final idsPtr = txn.alloc<Int64>(sourceIds.length + targetIds.length);
      for (var i = 0; i < sourceIds.length; i++) {
        idsPtr[i * 2] = sourceIds[i];
        idsPtr[i * 2 + 1] = targetIds[i];
      }

      IC.isar_link_verify(
        ptr,
        txn.ptr,
        link.id,
        idsPtr,
        sourceIds.length + targetIds.length,
      );
      await txn.wait();
    });
  }
}
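The `importJson` methods above simply UTF-8-encode the serialized JSON before delegating to the raw import. That encoding step can be sketched in isolation; `encodeJson` is an illustrative helper name, not part of the Isar API:

```dart
import 'dart:convert';
import 'dart:typed_data';

// Mirrors what importJson does before calling importJsonRaw:
// serialize to a JSON string, then encode that string as UTF-8 bytes.
Uint8List encodeJson(List<Map<String, dynamic>> json) =>
    const Utf8Encoder().convert(jsonEncode(json));

void main() {
  final bytes = encodeJson([
    {'id': 1, 'name': 'Jane'},
  ]);
  // Decoding the bytes recovers the JSON text.
  print(utf8.decode(bytes)); // [{"id":1,"name":"Jane"}]
}
```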

lib/src/native/isar_core.dart (new file, 234 lines)

// ignore_for_file: public_member_api_docs

import 'dart:async';
import 'dart:ffi';
import 'dart:io';
import 'dart:isolate';

import 'package:ffi/ffi.dart';
import 'package:isar/isar.dart';
import 'package:isar/src/native/bindings.dart';

const Id isarMinId = -9223372036854775807;

const Id isarMaxId = 9223372036854775807;

const Id isarAutoIncrementId = -9223372036854775808;

typedef IsarAbi = Abi;

const int minByte = 0;
const int maxByte = 255;
const int minInt = -2147483648;
const int maxInt = 2147483647;
const int minLong = -9223372036854775808;
const int maxLong = 9223372036854775807;
const double minDouble = double.nan;
const double maxDouble = double.infinity;

const nullByte = IsarObject_NULL_BYTE;
const nullInt = IsarObject_NULL_INT;
const nullLong = IsarObject_NULL_LONG;
const nullFloat = double.nan;
const nullDouble = double.nan;
final nullDate = DateTime.fromMillisecondsSinceEpoch(0);

const nullBool = IsarObject_NULL_BOOL;
const falseBool = IsarObject_FALSE_BOOL;
const trueBool = IsarObject_TRUE_BOOL;

const String _githubUrl = 'https://github.com/isar/isar/releases/download';

bool _isarInitialized = false;

// ignore: non_constant_identifier_names
late final IsarCoreBindings IC;

typedef FinalizerFunction = void Function(Pointer<Void> token);
late final Pointer<NativeFinalizerFunction> isarClose;
late final Pointer<NativeFinalizerFunction> isarQueryFree;

FutureOr<void> initializeCoreBinary({
  Map<Abi, String> libraries = const {},
  bool download = false,
}) {
  if (_isarInitialized) {
    return null;
  }

  String? libraryPath;
  if (!Platform.isIOS) {
    libraryPath = libraries[Abi.current()] ?? Abi.current().localName;
  }

  try {
    _initializePath(libraryPath);
  } catch (e) {
    if (!Platform.isAndroid && !Platform.isIOS) {
      final downloadPath = _getLibraryDownloadPath(libraries);
      if (download) {
        return _downloadIsarCore(downloadPath).then((value) {
          _initializePath(downloadPath);
        });
      } else {
        // try to use the binary at the download path anyway
        _initializePath(downloadPath);
      }
    } else {
      throw IsarError(
        'Could not initialize IsarCore library for processor architecture '
        '"${Abi.current()}". If you create a Flutter app, make sure to add '
        'isar_flutter_libs to your dependencies.\n$e',
      );
    }
  }
}

void _initializePath(String? libraryPath) {
  late DynamicLibrary dylib;
  if (Platform.isIOS) {
    dylib = DynamicLibrary.process();
  } else {
    dylib = DynamicLibrary.open(libraryPath!);
  }

  final bindings = IsarCoreBindings(dylib);

  final coreVersion = bindings.isar_version().cast<Utf8>().toDartString();
  if (coreVersion != Isar.version && coreVersion != 'debug') {
    throw IsarError(
      'Incorrect Isar Core version: Required ${Isar.version} found '
      '$coreVersion. Make sure to use the latest isar_flutter_libs. If you '
      'have a Dart only project, make sure that old Isar Core binaries are '
      'deleted.',
    );
  }

  IC = bindings;
  isarClose = dylib.lookup('isar_instance_close');
  isarQueryFree = dylib.lookup('isar_q_free');
  _isarInitialized = true;
}

String _getLibraryDownloadPath(Map<Abi, String> libraries) {
  final providedPath = libraries[Abi.current()];
  if (providedPath != null) {
    return providedPath;
  } else {
    final name = Abi.current().localName;
    if (Platform.script.path.isEmpty) {
      return name;
    }
    var dir = Platform.script.pathSegments
        .sublist(0, Platform.script.pathSegments.length - 1)
        .join(Platform.pathSeparator);
    if (!Platform.isWindows) {
      // Not on windows, add leading platform path separator
      dir = '${Platform.pathSeparator}$dir';
    }
    return '$dir${Platform.pathSeparator}$name';
  }
}

Future<void> _downloadIsarCore(String libraryPath) async {
  final libraryFile = File(libraryPath);
  // ignore: avoid_slow_async_io
  if (await libraryFile.exists()) {
    return;
  }
  final remoteName = Abi.current().remoteName;
  final uri = Uri.parse('$_githubUrl/${Isar.version}/$remoteName');
  final request = await HttpClient().getUrl(uri);
  final response = await request.close();
  if (response.statusCode != 200) {
    throw IsarError(
      'Could not download IsarCore library: ${response.reasonPhrase}',
    );
  }
  await response.pipe(libraryFile.openWrite());
}

IsarError? isarErrorFromResult(int result) {
  if (result != 0) {
    final error = IC.isar_get_error(result);
    if (error.address == 0) {
      throw IsarError(
        'There was an error but it could not be loaded from IsarCore.',
      );
    }
    try {
      final message = error.cast<Utf8>().toDartString();
      return IsarError(message);
    } finally {
      IC.isar_free_string(error);
    }
  } else {
    return null;
  }
}

@pragma('vm:prefer-inline')
void nCall(int result) {
  final error = isarErrorFromResult(result);
  if (error != null) {
    throw error;
  }
}

Stream<void> wrapIsarPort(ReceivePort port) {
  final portStreamController = StreamController<void>(onCancel: port.close);
  port.listen((event) {
    if (event == 0) {
      portStreamController.add(null);
    } else {
      final error = isarErrorFromResult(event as int);
      portStreamController.addError(error!);
    }
  });
  return portStreamController.stream;
}

extension PointerX on Pointer {
  @pragma('vm:prefer-inline')
  bool get isNull => address == 0;
}

extension on Abi {
  String get localName {
    switch (Abi.current()) {
      case Abi.androidArm:
      case Abi.androidArm64:
      case Abi.androidIA32:
      case Abi.androidX64:
        return 'libisar.so';
      case Abi.macosArm64:
      case Abi.macosX64:
        return 'libisar.dylib';
      case Abi.linuxX64:
        return 'libisar.so';
      case Abi.windowsArm64:
      case Abi.windowsX64:
        return 'isar.dll';
      default:
        throw IsarError(
          'Unsupported processor architecture "${Abi.current()}". '
          'Please open an issue on GitHub to request it.',
        );
    }
  }

  String get remoteName {
    switch (Abi.current()) {
      case Abi.macosArm64:
      case Abi.macosX64:
        return 'libisar_macos.dylib';
      case Abi.linuxX64:
        return 'libisar_linux_x64.so';
      case Abi.windowsArm64:
        return 'isar_windows_arm64.dll';
      case Abi.windowsX64:
        return 'isar_windows_x64.dll';
    }
    throw UnimplementedError();
  }
}
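The error-handling convention in `isar_core.dart` above is that native calls return an integer result code, and `nCall` turns any nonzero code into a thrown `IsarError`. The pattern can be sketched standalone; `NativeError`, `lookupError`, and `check` are illustrative stand-ins, not Isar APIs:

```dart
// Minimal sketch of the result-code pattern (hypothetical names).
class NativeError implements Exception {
  NativeError(this.message);
  final String message;
}

// Stand-in for querying the native side for an error message.
String? lookupError(int result) => result == 0 ? null : 'error code $result';

// Analogous to nCall: throw if the native call reported a failure,
// otherwise return normally so call sites stay one-liners.
void check(int result) {
  final message = lookupError(result);
  if (message != null) {
    throw NativeError(message);
  }
}

void main() {
  check(0); // success: no exception
  try {
    check(7);
  } on NativeError catch (e) {
    print(e.message); // error code 7
  }
}
```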

lib/src/native/isar_impl.dart (new file, 139 lines)

// ignore_for_file: public_member_api_docs

import 'dart:async';
import 'dart:ffi';
import 'dart:isolate';

import 'package:ffi/ffi.dart';
import 'package:isar/src/common/isar_common.dart';
import 'package:isar/src/native/bindings.dart';
import 'package:isar/src/native/encode_string.dart';
import 'package:isar/src/native/isar_core.dart';
import 'package:isar/src/native/txn.dart';

class IsarImpl extends IsarCommon implements Finalizable {
  IsarImpl(super.name, this.ptr) {
    _finalizer = NativeFinalizer(isarClose);
    _finalizer.attach(this, ptr.cast(), detach: this);
  }

  final Pointer<CIsarInstance> ptr;
  late final NativeFinalizer _finalizer;

  final offsets = <Type, List<int>>{};

  final Pointer<Pointer<CIsarTxn>> _syncTxnPtrPtr = malloc<Pointer<CIsarTxn>>();

  String? _directory;

  @override
  String get directory {
    requireOpen();

    if (_directory == null) {
      final dirPtr = IC.isar_instance_get_path(ptr);
      try {
        _directory = dirPtr.cast<Utf8>().toDartString();
      } finally {
        IC.isar_free_string(dirPtr);
      }
    }

    return _directory!;
  }

  @override
  Future<Transaction> beginTxn(bool write, bool silent) async {
    final port = ReceivePort();
    final portStream = wrapIsarPort(port);

    final txnPtrPtr = malloc<Pointer<CIsarTxn>>();
    IC.isar_txn_begin(
      ptr,
      txnPtrPtr,
      false,
      write,
      silent,
      port.sendPort.nativePort,
    );

    final txn = Txn.async(this, txnPtrPtr.value, write, portStream);
    await txn.wait();
    return txn;
  }

  @override
  Transaction beginTxnSync(bool write, bool silent) {
    nCall(IC.isar_txn_begin(ptr, _syncTxnPtrPtr, true, write, silent, 0));
    return Txn.sync(this, _syncTxnPtrPtr.value, write);
  }

  @override
  bool performClose(bool deleteFromDisk) {
    _finalizer.detach(this);
    if (deleteFromDisk) {
      return IC.isar_instance_close_and_delete(ptr);
    } else {
      return IC.isar_instance_close(ptr);
    }
  }

  @override
  Future<int> getSize({
    bool includeIndexes = false,
    bool includeLinks = false,
  }) {
    return getTxn(false, (Txn txn) async {
      final sizePtr = txn.alloc<Int64>();
      IC.isar_instance_get_size(
        ptr,
        txn.ptr,
        includeIndexes,
        includeLinks,
        sizePtr,
      );
      await txn.wait();
      return sizePtr.value;
    });
  }

  @override
  int getSizeSync({bool includeIndexes = false, bool includeLinks = false}) {
    return getTxnSync(false, (Txn txn) {
      final sizePtr = txn.alloc<Int64>();
      nCall(
        IC.isar_instance_get_size(
          ptr,
          txn.ptr,
          includeIndexes,
          includeLinks,
          sizePtr,
        ),
      );
      return sizePtr.value;
    });
  }

  @override
  Future<void> copyToFile(String targetPath) async {
    final pathPtr = targetPath.toCString(malloc);
    final receivePort = ReceivePort();
    final nativePort = receivePort.sendPort.nativePort;

    try {
      final stream = wrapIsarPort(receivePort);
      IC.isar_instance_copy_to_file(ptr, pathPtr, nativePort);
      await stream.first;
    } finally {
      malloc.free(pathPtr);
    }
  }

  @override
  Future<void> verify() async {
    return getTxn(false, (Txn txn) async {
      IC.isar_instance_verify(ptr, txn.ptr);
      await txn.wait();
    });
  }
}

lib/src/native/isar_link_impl.dart (new file, 121 lines)

// ignore_for_file: public_member_api_docs

import 'dart:ffi';

import 'package:isar/isar.dart';
import 'package:isar/src/common/isar_link_base_impl.dart';
import 'package:isar/src/common/isar_link_common.dart';
import 'package:isar/src/common/isar_links_common.dart';
import 'package:isar/src/native/isar_collection_impl.dart';
import 'package:isar/src/native/isar_core.dart';
import 'package:isar/src/native/txn.dart';

mixin IsarLinkBaseMixin<OBJ> on IsarLinkBaseImpl<OBJ> {
  @override
  IsarCollectionImpl<dynamic> get sourceCollection =>
      super.sourceCollection as IsarCollectionImpl;

  @override
  IsarCollectionImpl<OBJ> get targetCollection =>
      super.targetCollection as IsarCollectionImpl<OBJ>;

  late final int linkId = sourceCollection.schema.link(linkName).id;

  @override
  late final getId = targetCollection.schema.getId;

  @override
  Future<void> update({
    Iterable<OBJ> link = const [],
    Iterable<OBJ> unlink = const [],
    bool reset = false,
  }) {
    final linkList = link.toList();
    final unlinkList = unlink.toList();

    final containingId = requireAttached();
    return targetCollection.isar.getTxn(true, (Txn txn) {
      final count = linkList.length + unlinkList.length;
      final idsPtr = txn.alloc<Int64>(count);
      final ids = idsPtr.asTypedList(count);

      for (var i = 0; i < linkList.length; i++) {
        ids[i] = requireGetId(linkList[i]);
      }
      for (var i = 0; i < unlinkList.length; i++) {
        ids[linkList.length + i] = requireGetId(unlinkList[i]);
      }

      IC.isar_link_update_all(
        sourceCollection.ptr,
        txn.ptr,
        linkId,
        containingId,
        idsPtr,
        linkList.length,
        unlinkList.length,
        reset,
      );
      return txn.wait();
    });
  }

  @override
  void updateSync({
    Iterable<OBJ> link = const [],
    Iterable<OBJ> unlink = const [],
    bool reset = false,
  }) {
    final containingId = requireAttached();
    targetCollection.isar.getTxnSync(true, (Txn txn) {
      if (reset) {
        nCall(
          IC.isar_link_unlink_all(
            sourceCollection.ptr,
            txn.ptr,
            linkId,
            containingId,
          ),
        );
      }

      for (final object in link) {
        var id = getId(object);
        if (id == Isar.autoIncrement) {
          id = targetCollection.putByIndexSyncInternal(
            txn: txn,
            object: object,
          );
        }

        nCall(
          IC.isar_link(
            sourceCollection.ptr,
            txn.ptr,
            linkId,
            containingId,
            id,
          ),
        );
      }
      for (final object in unlink) {
        final unlinkId = requireGetId(object);
        nCall(
          IC.isar_link_unlink(
            sourceCollection.ptr,
            txn.ptr,
            linkId,
            containingId,
            unlinkId,
          ),
        );
      }
    });
  }
}

class IsarLinkImpl<OBJ> extends IsarLinkCommon<OBJ>
    with IsarLinkBaseMixin<OBJ> {}

class IsarLinksImpl<OBJ> extends IsarLinksCommon<OBJ>
    with IsarLinkBaseMixin<OBJ> {}

lib/src/native/isar_reader_impl.dart (new file, 591 lines)

// ignore_for_file: public_member_api_docs

import 'dart:convert';
import 'dart:typed_data';

import 'package:isar/isar.dart';
import 'package:isar/src/native/isar_core.dart';
import 'package:meta/meta.dart';

/// @nodoc
@protected
class IsarReaderImpl implements IsarReader {
  IsarReaderImpl(this._buffer)
      : _byteData = ByteData.view(_buffer.buffer, _buffer.offsetInBytes) {
    _staticSize = _byteData.getUint16(0, Endian.little);
  }

  static const Utf8Decoder utf8Decoder = Utf8Decoder();

  final Uint8List _buffer;
  final ByteData _byteData;
  late int _staticSize;

  @pragma('vm:prefer-inline')
  bool _readBool(int offset) {
    final value = _buffer[offset];
    if (value == trueBool) {
      return true;
    } else {
      return false;
    }
  }

  @pragma('vm:prefer-inline')
  @override
  bool readBool(int offset) {
    if (offset >= _staticSize) {
      return false;
    }
    return _readBool(offset);
  }

  @pragma('vm:prefer-inline')
  bool? _readBoolOrNull(int offset) {
    final value = _buffer[offset];
    if (value == trueBool) {
      return true;
    } else if (value == falseBool) {
      return false;
    } else {
      return null;
    }
  }

  @pragma('vm:prefer-inline')
  @override
  bool? readBoolOrNull(int offset) {
    if (offset >= _staticSize) {
      return null;
    }
    return _readBoolOrNull(offset);
  }

  @pragma('vm:prefer-inline')
  @override
  int readByte(int offset) {
    if (offset >= _staticSize) {
      return 0;
    }
    return _buffer[offset];
  }

  @pragma('vm:prefer-inline')
  @override
  int? readByteOrNull(int offset) {
    if (offset >= _staticSize) {
      return null;
    }
    return _buffer[offset];
  }

  @pragma('vm:prefer-inline')
  @override
  int readInt(int offset) {
    if (offset >= _staticSize) {
      return nullInt;
    }
    return _byteData.getInt32(offset, Endian.little);
  }

  @pragma('vm:prefer-inline')
  int? _readIntOrNull(int offset) {
    final value = _byteData.getInt32(offset, Endian.little);
    if (value != nullInt) {
      return value;
    } else {
      return null;
    }
  }

  @pragma('vm:prefer-inline')
  @override
  int? readIntOrNull(int offset) {
    if (offset >= _staticSize) {
      return null;
    }
    return _readIntOrNull(offset);
  }

  @pragma('vm:prefer-inline')
  @override
  double readFloat(int offset) {
    if (offset >= _staticSize) {
      return nullDouble;
    }
    return _byteData.getFloat32(offset, Endian.little);
  }

  @pragma('vm:prefer-inline')
  double? _readFloatOrNull(int offset) {
    final value = _byteData.getFloat32(offset, Endian.little);
    if (!value.isNaN) {
      return value;
    } else {
      return null;
    }
  }

  @pragma('vm:prefer-inline')
  @override
  double? readFloatOrNull(int offset) {
    if (offset >= _staticSize) {
      return null;
    }
    return _readFloatOrNull(offset);
  }

  @pragma('vm:prefer-inline')
  @override
  int readLong(int offset) {
    if (offset >= _staticSize) {
      return nullLong;
    }
    return _byteData.getInt64(offset, Endian.little);
  }

  @pragma('vm:prefer-inline')
  int? _readLongOrNull(int offset) {
    final value = _byteData.getInt64(offset, Endian.little);
    if (value != nullLong) {
      return value;
    } else {
      return null;
    }
  }

  @pragma('vm:prefer-inline')
  @override
  int? readLongOrNull(int offset) {
    if (offset >= _staticSize) {
      return null;
    }
    return _readLongOrNull(offset);
  }

  @pragma('vm:prefer-inline')
  @override
  double readDouble(int offset) {
    if (offset >= _staticSize) {
      return nullDouble;
    }
    return _byteData.getFloat64(offset, Endian.little);
  }

  @pragma('vm:prefer-inline')
  double? _readDoubleOrNull(int offset) {
    final value = _byteData.getFloat64(offset, Endian.little);
    if (!value.isNaN) {
      return value;
    } else {
      return null;
    }
  }

  @pragma('vm:prefer-inline')
  @override
  double? readDoubleOrNull(int offset) {
    if (offset >= _staticSize) {
      return null;
    }
    return _readDoubleOrNull(offset);
  }

  @pragma('vm:prefer-inline')
  @override
  DateTime readDateTime(int offset) {
    final time = readLongOrNull(offset);
    return time != null
        ? DateTime.fromMicrosecondsSinceEpoch(time, isUtc: true).toLocal()
        : nullDate;
  }

  @pragma('vm:prefer-inline')
  @override
  DateTime? readDateTimeOrNull(int offset) {
    final time = readLongOrNull(offset);
    if (time != null) {
      return DateTime.fromMicrosecondsSinceEpoch(time, isUtc: true).toLocal();
    } else {
      return null;
    }
  }

  @pragma('vm:prefer-inline')
  int _readUint24(int offset) {
    return _buffer[offset] |
        _buffer[offset + 1] << 8 |
        _buffer[offset + 2] << 16;
  }

  @pragma('vm:prefer-inline')
  @override
  String readString(int offset) {
    return readStringOrNull(offset) ?? '';
  }

  @pragma('vm:prefer-inline')
  @override
  String? readStringOrNull(int offset) {
    if (offset >= _staticSize) {
      return null;
    }

    var bytesOffset = _readUint24(offset);
    if (bytesOffset == 0) {
      return null;
    }

    final length = _readUint24(bytesOffset);
    bytesOffset += 3;

    return utf8Decoder.convert(_buffer, bytesOffset, bytesOffset + length);
  }
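The reader locates variable-length data through 3-byte little-endian offsets (`_readUint24` above): the static section stores a 24-bit pointer, and the pointed-to data starts with a 24-bit length prefix. A self-contained sketch of that layout, using an illustrative buffer rather than a real Isar object:

```dart
import 'dart:convert';
import 'dart:typed_data';

// Read a 24-bit little-endian unsigned int, like _readUint24 above.
int readUint24(Uint8List buffer, int offset) =>
    buffer[offset] | buffer[offset + 1] << 8 | buffer[offset + 2] << 16;

void main() {
  // Hypothetical layout: offset 0 holds a 3-byte pointer to the string data,
  // which itself starts with a 3-byte length followed by the UTF-8 bytes.
  final bytes = utf8.encode('isar');
  final buffer = Uint8List(3 + 3 + bytes.length);
  buffer[0] = 3; // pointer to offset 3 (high bytes stay 0)
  buffer[3] = bytes.length; // length prefix
  buffer.setAll(6, bytes);

  var dataOffset = readUint24(buffer, 0);
  final length = readUint24(buffer, dataOffset);
  dataOffset += 3; // skip the length prefix
  print(utf8.decode(buffer.sublist(dataOffset, dataOffset + length))); // isar
}
```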
|
||||||
|
|
||||||
|
@pragma('vm:prefer-inline')
|
||||||
|
@override
|
||||||
|
T? readObjectOrNull<T>(
|
||||||
|
int offset,
|
||||||
|
Deserialize<T> deserialize,
|
||||||
|
Map<Type, List<int>> allOffsets,
|
||||||
|
) {
|
||||||
|
if (offset >= _staticSize) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
var bytesOffset = _readUint24(offset);
|
||||||
|
if (bytesOffset == 0) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
final length = _readUint24(bytesOffset);
|
||||||
|
bytesOffset += 3;
|
||||||
|
|
||||||
|
final buffer =
|
||||||
|
Uint8List.sublistView(_buffer, bytesOffset, bytesOffset + length);
|
||||||
|
final reader = IsarReaderImpl(buffer);
|
||||||
|
final offsets = allOffsets[T]!;
|
||||||
|
return deserialize(0, reader, offsets, allOffsets);
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<bool>? readBoolList(int offset) {
|
||||||
|
if (offset >= _staticSize) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
var listOffset = _readUint24(offset);
|
||||||
|
if (listOffset == 0) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
final length = _readUint24(listOffset);
|
||||||
|
listOffset += 3;
|
||||||
|
|
||||||
|
final list = List<bool>.filled(length, false);
|
||||||
|
for (var i = 0; i < length; i++) {
|
||||||
|
list[i] = _readBool(listOffset + i);
|
||||||
|
}
|
||||||
|
return list;
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<bool?>? readBoolOrNullList(int offset) {
|
||||||
|
if (offset >= _staticSize) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
var listOffset = _readUint24(offset);
|
||||||
|
if (listOffset == 0) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
final length = _readUint24(listOffset);
|
||||||
|
listOffset += 3;
|
||||||
|
|
||||||
|
final list = List<bool?>.filled(length, null);
|
||||||
|
for (var i = 0; i < length; i++) {
|
||||||
|
list[i] = _readBoolOrNull(listOffset + i);
|
||||||
|
}
|
||||||
|
return list;
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<int>? readByteList(int offset) {
|
||||||
|
if (offset >= _staticSize) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
var listOffset = _readUint24(offset);
|
||||||
|
if (listOffset == 0) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
final length = _readUint24(listOffset);
|
||||||
|
listOffset += 3;
|
||||||
|
|
||||||
|
return _buffer.sublist(listOffset, listOffset + length);
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<int>? readIntList(int offset) {
|
||||||
|
if (offset >= _staticSize) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
var listOffset = _readUint24(offset);
|
||||||
|
if (listOffset == 0) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
final length = _readUint24(listOffset);
|
||||||
|
listOffset += 3;
|
||||||
|
|
||||||
|
final list = Int32List(length);
|
||||||
|
for (var i = 0; i < length; i++) {
|
||||||
|
list[i] = _byteData.getInt32(listOffset + i * 4, Endian.little);
|
||||||
|
}
|
||||||
|
return list;
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<int?>? readIntOrNullList(int offset) {
|
||||||
|
if (offset >= _staticSize) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
var listOffset = _readUint24(offset);
|
||||||
|
if (listOffset == 0) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
final length = _readUint24(listOffset);
|
||||||
|
listOffset += 3;
|
||||||
|
|
||||||
|
final list = List<int?>.filled(length, null);
|
||||||
|
for (var i = 0; i < length; i++) {
|
||||||
|
list[i] = _readIntOrNull(listOffset + i * 4);
|
||||||
|
}
|
||||||
|
return list;
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<double>? readFloatList(int offset) {
|
||||||
|
if (offset >= _staticSize) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
var listOffset = _readUint24(offset);
|
||||||
|
if (listOffset == 0) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
final length = _readUint24(listOffset);
|
||||||
|
listOffset += 3;
|
||||||
|
|
||||||
|
final list = Float32List(length);
|
||||||
|
for (var i = 0; i < length; i++) {
|
||||||
|
list[i] = _byteData.getFloat32(listOffset + i * 4, Endian.little);
|
||||||
|
}
|
||||||
|
return list;
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<double?>? readFloatOrNullList(int offset) {
|
||||||
|
if (offset >= _staticSize) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
var listOffset = _readUint24(offset);
|
||||||
|
if (listOffset == 0) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
final length = _readUint24(listOffset);
|
||||||
|
listOffset += 3;
|
||||||
|
|
||||||
|
final list = List<double?>.filled(length, null);
|
||||||
|
for (var i = 0; i < length; i++) {
|
||||||
|
list[i] = _readFloatOrNull(listOffset + i * 4);
|
||||||
|
}
|
||||||
|
return list;
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<int>? readLongList(int offset) {
|
||||||
|
if (offset >= _staticSize) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
var listOffset = _readUint24(offset);
|
||||||
|
if (listOffset == 0) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
final length = _readUint24(listOffset);
|
||||||
|
listOffset += 3;
|
||||||
|
|
||||||
|
final list = Int64List(length);
|
||||||
|
for (var i = 0; i < length; i++) {
|
||||||
|
list[i] = _byteData.getInt64(listOffset + i * 8, Endian.little);
|
||||||
|
}
|
||||||
|
return list;
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<int?>? readLongOrNullList(int offset) {
|
||||||
|
if (offset >= _staticSize) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
var listOffset = _readUint24(offset);
|
||||||
|
if (listOffset == 0) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
final length = _readUint24(listOffset);
|
||||||
|
listOffset += 3;
|
||||||
|
|
||||||
|
final list = List<int?>.filled(length, null);
|
||||||
|
for (var i = 0; i < length; i++) {
|
||||||
|
list[i] = _readLongOrNull(listOffset + i * 8);
|
||||||
|
}
|
||||||
|
return list;
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<double>? readDoubleList(int offset) {
|
||||||
|
if (offset >= _staticSize) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
var listOffset = _readUint24(offset);
|
||||||
|
if (listOffset == 0) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
final length = _readUint24(listOffset);
|
||||||
|
listOffset += 3;
|
||||||
|
|
||||||
|
final list = Float64List(length);
|
||||||
|
for (var i = 0; i < length; i++) {
|
||||||
|
list[i] = _byteData.getFloat64(listOffset + i * 8, Endian.little);
|
||||||
|
}
|
||||||
|
return list;
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<double?>? readDoubleOrNullList(int offset) {
|
||||||
|
if (offset >= _staticSize) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
var listOffset = _readUint24(offset);
|
||||||
|
if (listOffset == 0) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
final length = _readUint24(listOffset);
|
||||||
|
listOffset += 3;
|
||||||
|
|
||||||
|
final list = List<double?>.filled(length, null);
|
||||||
|
for (var i = 0; i < length; i++) {
|
||||||
|
list[i] = _readDoubleOrNull(listOffset + i * 8);
|
||||||
|
}
|
||||||
|
return list;
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<DateTime>? readDateTimeList(int offset) {
|
||||||
|
return readLongOrNullList(offset)?.map((e) {
|
||||||
|
if (e != null) {
|
||||||
|
return DateTime.fromMicrosecondsSinceEpoch(e, isUtc: true).toLocal();
|
||||||
|
} else {
|
||||||
|
return nullDate;
|
||||||
|
}
|
||||||
|
}).toList();
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<DateTime?>? readDateTimeOrNullList(int offset) {
|
||||||
|
return readLongOrNullList(offset)?.map((e) {
|
||||||
|
if (e != null) {
|
||||||
|
return DateTime.fromMicrosecondsSinceEpoch(e, isUtc: true).toLocal();
|
||||||
|
}
|
||||||
|
}).toList();
|
||||||
|
}
|
||||||
|
|
||||||
|
List<T>? readDynamicList<T>(
|
||||||
|
int offset,
|
||||||
|
T nullValue,
|
||||||
|
T Function(int startOffset, int endOffset) transform,
|
||||||
|
) {
|
||||||
|
if (offset >= _staticSize) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
var listOffset = _readUint24(offset);
|
||||||
|
if (listOffset == 0) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
final length = _readUint24(listOffset);
|
||||||
|
listOffset += 3;
|
||||||
|
|
||||||
|
final list = List.filled(length, nullValue);
|
||||||
|
var contentOffset = listOffset + length * 3;
|
||||||
|
for (var i = 0; i < length; i++) {
|
||||||
|
final itemSize = _readUint24(listOffset + i * 3);
|
||||||
|
|
||||||
|
if (itemSize != 0) {
|
||||||
|
list[i] = transform(contentOffset, contentOffset + itemSize - 1);
|
||||||
|
contentOffset += itemSize - 1;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return list;
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<String>? readStringList(int offset) {
|
||||||
|
return readDynamicList(offset, '', (startOffset, endOffset) {
|
||||||
|
return utf8Decoder.convert(_buffer, startOffset, endOffset);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<String?>? readStringOrNullList(int offset) {
|
||||||
|
return readDynamicList(offset, null, (startOffset, endOffset) {
|
||||||
|
return utf8Decoder.convert(_buffer, startOffset, endOffset);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<T>? readObjectList<T>(
|
||||||
|
int offset,
|
||||||
|
Deserialize<T> deserialize,
|
||||||
|
Map<Type, List<int>> allOffsets,
|
||||||
|
T defaultValue,
|
||||||
|
) {
|
||||||
|
final offsets = allOffsets[T]!;
|
||||||
|
return readDynamicList(offset, defaultValue, (startOffset, endOffset) {
|
||||||
|
final buffer = Uint8List.sublistView(_buffer, startOffset, endOffset);
|
||||||
|
final reader = IsarReaderImpl(buffer);
|
||||||
|
return deserialize(0, reader, offsets, allOffsets);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
@override
|
||||||
|
List<T?>? readObjectOrNullList<T>(
|
||||||
|
int offset,
|
||||||
|
Deserialize<T> deserialize,
|
||||||
|
Map<Type, List<int>> allOffsets,
|
||||||
|
) {
|
||||||
|
final offsets = allOffsets[T]!;
|
||||||
|
return readDynamicList(offset, null, (startOffset, endOffset) {
|
||||||
|
final buffer = Uint8List.sublistView(_buffer, startOffset, endOffset);
|
||||||
|
final reader = IsarReaderImpl(buffer);
|
||||||
|
return deserialize(0, reader, offsets, allOffsets);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
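The `_readUint24` helper above decodes the 3-byte little-endian offsets and lengths that the reader uses for all dynamic data. As a minimal illustration (a Python sketch, not part of the library), the encoding and its inverse look like this:

```python
def write_uint24(buf: bytearray, offset: int, value: int) -> None:
    """Store value as 3 little-endian bytes (mirrors _writeUint24)."""
    buf[offset] = value & 0xFF
    buf[offset + 1] = (value >> 8) & 0xFF
    buf[offset + 2] = (value >> 16) & 0xFF


def read_uint24(buf: bytes, offset: int) -> int:
    """Mirror of _readUint24: b0 | b1 << 8 | b2 << 16."""
    return buf[offset] | buf[offset + 1] << 8 | buf[offset + 2] << 16


buf = bytearray(8)
write_uint24(buf, 2, 0x123456)
print(read_uint24(buf, 2) == 0x123456)  # True
```

Three bytes cap offsets at 16 MiB - 1, which is why `0` can double as the null sentinel: a real payload never starts at offset 0, since the static section occupies the front of the buffer.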
lib/src/native/isar_writer_impl.dart (new file, 284 lines)
@@ -0,0 +1,284 @@
// ignore_for_file: public_member_api_docs, prefer_asserts_with_message,
// avoid_positional_boolean_parameters

import 'dart:typed_data';

import 'package:isar/isar.dart';
import 'package:isar/src/native/encode_string.dart';
import 'package:isar/src/native/isar_core.dart';
import 'package:meta/meta.dart';

/// @nodoc
@protected
class IsarWriterImpl implements IsarWriter {
  IsarWriterImpl(Uint8List buffer, int staticSize)
      : _dynamicOffset = staticSize,
        _buffer = buffer,
        _byteData = ByteData.view(buffer.buffer, buffer.offsetInBytes) {
    _byteData.setUint16(0, staticSize, Endian.little);

    // Required because we don't want to persist uninitialized memory.
    for (var i = 2; i < staticSize; i++) {
      _buffer[i] = 0;
    }
  }

  final Uint8List _buffer;

  final ByteData _byteData;

  int _dynamicOffset;

  int get usedBytes => _dynamicOffset;

  @override
  @pragma('vm:prefer-inline')
  void writeBool(int offset, bool? value) {
    _buffer[offset] = value.byteValue;
  }

  @override
  @pragma('vm:prefer-inline')
  void writeByte(int offset, int value) {
    assert(value >= minByte && value <= maxByte);
    _buffer[offset] = value;
  }

  @override
  @pragma('vm:prefer-inline')
  void writeInt(int offset, int? value) {
    value ??= nullInt;
    assert(value >= minInt && value <= maxInt);
    _byteData.setInt32(offset, value, Endian.little);
  }

  @override
  @pragma('vm:prefer-inline')
  void writeFloat(int offset, double? value) {
    _byteData.setFloat32(offset, value ?? double.nan, Endian.little);
  }

  @override
  @pragma('vm:prefer-inline')
  void writeLong(int offset, int? value) {
    _byteData.setInt64(offset, value ?? nullLong, Endian.little);
  }

  @override
  @pragma('vm:prefer-inline')
  void writeDouble(int offset, double? value) {
    _byteData.setFloat64(offset, value ?? double.nan, Endian.little);
  }

  @override
  @pragma('vm:prefer-inline')
  void writeDateTime(int offset, DateTime? value) {
    writeLong(offset, value?.toUtc().microsecondsSinceEpoch);
  }

  @pragma('vm:prefer-inline')
  void _writeUint24(int offset, int value) {
    _buffer[offset] = value;
    _buffer[offset + 1] = value >> 8;
    _buffer[offset + 2] = value >> 16;
  }

  @override
  @pragma('vm:prefer-inline')
  void writeString(int offset, String? value) {
    if (value != null) {
      final byteCount = encodeString(value, _buffer, _dynamicOffset + 3);
      _writeUint24(offset, _dynamicOffset);
      _writeUint24(_dynamicOffset, byteCount);
      _dynamicOffset += byteCount + 3;
    } else {
      _writeUint24(offset, 0);
    }
  }

  @override
  @pragma('vm:prefer-inline')
  void writeObject<T>(
    int offset,
    Map<Type, List<int>> allOffsets,
    Serialize<T> serialize,
    T? value,
  ) {
    if (value != null) {
      final buffer = Uint8List.sublistView(_buffer, _dynamicOffset + 3);
      final offsets = allOffsets[T]!;
      final binaryWriter = IsarWriterImpl(buffer, offsets.last);
      serialize(value, binaryWriter, offsets, allOffsets);
      final byteCount = binaryWriter.usedBytes;
      _writeUint24(offset, _dynamicOffset);
      _writeUint24(_dynamicOffset, byteCount);
      _dynamicOffset += byteCount + 3;
    } else {
      _writeUint24(offset, 0);
    }
  }

  @pragma('vm:prefer-inline')
  void _writeListOffset(int offset, int? length) {
    if (length == null) {
      _writeUint24(offset, 0);
    } else {
      _writeUint24(offset, _dynamicOffset);
      _writeUint24(_dynamicOffset, length);
      _dynamicOffset += 3;
    }
  }

  @override
  @pragma('vm:prefer-inline')
  void writeByteList(int offset, List<int>? values) {
    _writeListOffset(offset, values?.length);

    if (values != null) {
      for (var i = 0; i < values.length; i++) {
        _buffer[_dynamicOffset++] = values[i];
      }
    }
  }

  @override
  void writeBoolList(int offset, List<bool?>? values) {
    _writeListOffset(offset, values?.length);

    if (values != null) {
      for (var i = 0; i < values.length; i++) {
        _buffer[_dynamicOffset++] = values[i].byteValue;
      }
    }
  }

  @override
  void writeIntList(int offset, List<int?>? values) {
    _writeListOffset(offset, values?.length);

    if (values != null) {
      for (var value in values) {
        value ??= nullInt;
        assert(value >= minInt && value <= maxInt);
        _byteData.setUint32(_dynamicOffset, value, Endian.little);
        _dynamicOffset += 4;
      }
    }
  }

  @override
  void writeFloatList(int offset, List<double?>? values) {
    _writeListOffset(offset, values?.length);

    if (values != null) {
      for (var i = 0; i < values.length; i++) {
        _byteData.setFloat32(
          _dynamicOffset,
          values[i] ?? nullFloat,
          Endian.little,
        );
        _dynamicOffset += 4;
      }
    }
  }

  @override
  void writeLongList(int offset, List<int?>? values) {
    _writeListOffset(offset, values?.length);

    if (values != null) {
      for (var i = 0; i < values.length; i++) {
        _byteData.setInt64(
          _dynamicOffset,
          values[i] ?? nullLong,
          Endian.little,
        );
        _dynamicOffset += 8;
      }
    }
  }

  @override
  void writeDoubleList(int offset, List<double?>? values) {
    _writeListOffset(offset, values?.length);

    if (values != null) {
      for (var i = 0; i < values.length; i++) {
        _byteData.setFloat64(
          _dynamicOffset,
          values[i] ?? nullDouble,
          Endian.little,
        );
        _dynamicOffset += 8;
      }
    }
  }

  @override
  void writeDateTimeList(int offset, List<DateTime?>? values) {
    final longList = values?.map((e) => e.longValue).toList();
    writeLongList(offset, longList);
  }

  @override
  void writeStringList(int offset, List<String?>? values) {
    _writeListOffset(offset, values?.length);

    if (values != null) {
      final offsetListOffset = _dynamicOffset;
      _dynamicOffset += values.length * 3;
      for (var i = 0; i < values.length; i++) {
        final value = values[i];
        if (value != null) {
          final byteCount = encodeString(value, _buffer, _dynamicOffset);
          _writeUint24(offsetListOffset + i * 3, byteCount + 1);
          _dynamicOffset += byteCount;
        } else {
          _writeUint24(offsetListOffset + i * 3, 0);
        }
      }
    }
  }

  @override
  void writeObjectList<T>(
    int offset,
    Map<Type, List<int>> allOffsets,
    Serialize<T> serialize,
    List<T?>? values,
  ) {
    _writeListOffset(offset, values?.length);

    if (values != null) {
      final offsetListOffset = _dynamicOffset;
      _dynamicOffset += values.length * 3;

      final offsets = allOffsets[T]!;
      final staticSize = offsets.last;
      for (var i = 0; i < values.length; i++) {
        final value = values[i];
        if (value != null) {
          final buffer = Uint8List.sublistView(_buffer, _dynamicOffset);
          final binaryWriter = IsarWriterImpl(buffer, staticSize);
          serialize(value, binaryWriter, offsets, allOffsets);
          final byteCount = binaryWriter.usedBytes;
          _writeUint24(offsetListOffset + i * 3, byteCount + 1);
          _dynamicOffset += byteCount;
        } else {
          _writeUint24(offsetListOffset + i * 3, 0);
        }
      }
    }
  }
}

extension IsarBoolValue on bool? {
  @pragma('vm:prefer-inline')
  int get byteValue =>
      this == null ? nullBool : (this == true ? trueBool : falseBool);
}

extension IsarDateTimeValue on DateTime? {
  @pragma('vm:prefer-inline')
  int get longValue => this?.toUtc().microsecondsSinceEpoch ?? nullLong;
}
lib/src/native/open.dart (new file, 159 lines)
@@ -0,0 +1,159 @@
// ignore_for_file: public_member_api_docs, invalid_use_of_protected_member

import 'dart:convert';
import 'dart:ffi';
import 'dart:isolate';

import 'package:ffi/ffi.dart';
import 'package:isar/isar.dart';
import 'package:isar/src/common/schemas.dart';
import 'package:isar/src/native/bindings.dart';
import 'package:isar/src/native/encode_string.dart';
import 'package:isar/src/native/isar_collection_impl.dart';
import 'package:isar/src/native/isar_core.dart';
import 'package:isar/src/native/isar_impl.dart';

final Pointer<Pointer<CIsarInstance>> _isarPtrPtr =
    malloc<Pointer<CIsarInstance>>();

List<int> _getOffsets(
  Pointer<CIsarCollection> colPtr,
  int propertiesCount,
  int embeddedColId,
) {
  final offsetsPtr = malloc<Uint32>(propertiesCount);
  final staticSize = IC.isar_get_offsets(colPtr, embeddedColId, offsetsPtr);
  final offsets = offsetsPtr.asTypedList(propertiesCount).toList();
  offsets.add(staticSize);
  malloc.free(offsetsPtr);
  return offsets;
}

void _initializeInstance(
  IsarImpl isar,
  List<CollectionSchema<dynamic>> schemas,
) {
  final colPtrPtr = malloc<Pointer<CIsarCollection>>();

  final cols = <Type, IsarCollection<dynamic>>{};
  for (final schema in schemas) {
    nCall(IC.isar_instance_get_collection(isar.ptr, colPtrPtr, schema.id));

    final offsets = _getOffsets(colPtrPtr.value, schema.properties.length, 0);

    for (final embeddedSchema in schema.embeddedSchemas.values) {
      final embeddedType = embeddedSchema.type;
      if (!isar.offsets.containsKey(embeddedType)) {
        final offsets = _getOffsets(
          colPtrPtr.value,
          embeddedSchema.properties.length,
          embeddedSchema.id,
        );
        isar.offsets[embeddedType] = offsets;
      }
    }

    schema.toCollection(<OBJ>() {
      isar.offsets[OBJ] = offsets;

      schema as CollectionSchema<OBJ>;
      cols[OBJ] = IsarCollectionImpl<OBJ>(
        isar: isar,
        ptr: colPtrPtr.value,
        schema: schema,
      );
    });
  }

  malloc.free(colPtrPtr);

  isar.attachCollections(cols);
}

Future<Isar> openIsar({
  required List<CollectionSchema<dynamic>> schemas,
  required String directory,
  required String name,
  required int maxSizeMiB,
  required bool relaxedDurability,
  CompactCondition? compactOnLaunch,
}) async {
  initializeCoreBinary();
  IC.isar_connect_dart_api(NativeApi.postCObject.cast());

  return using((Arena alloc) async {
    final namePtr = name.toCString(alloc);
    final dirPtr = directory.toCString(alloc);

    final schemasJson = getSchemas(schemas).map((e) => e.toJson());
    final schemaStrPtr = jsonEncode(schemasJson.toList()).toCString(alloc);

    final compactMinFileSize = compactOnLaunch?.minFileSize;
    final compactMinBytes = compactOnLaunch?.minBytes;
    final compactMinRatio =
        compactOnLaunch == null ? double.nan : compactOnLaunch.minRatio;

    final receivePort = ReceivePort();
    final nativePort = receivePort.sendPort.nativePort;
    final stream = wrapIsarPort(receivePort);
    IC.isar_instance_create_async(
      _isarPtrPtr,
      namePtr,
      dirPtr,
      schemaStrPtr,
      maxSizeMiB,
      relaxedDurability,
      compactMinFileSize ?? 0,
      compactMinBytes ?? 0,
      compactMinRatio ?? 0,
      nativePort,
    );
    await stream.first;

    final isar = IsarImpl(name, _isarPtrPtr.value);
    _initializeInstance(isar, schemas);
    return isar;
  });
}

Isar openIsarSync({
  required List<CollectionSchema<dynamic>> schemas,
  required String directory,
  required String name,
  required int maxSizeMiB,
  required bool relaxedDurability,
  CompactCondition? compactOnLaunch,
}) {
  initializeCoreBinary();
  IC.isar_connect_dart_api(NativeApi.postCObject.cast());
  return using((Arena alloc) {
    final namePtr = name.toCString(alloc);
    final dirPtr = directory.toCString(alloc);

    final schemasJson = getSchemas(schemas).map((e) => e.toJson());
    final schemaStrPtr = jsonEncode(schemasJson.toList()).toCString(alloc);

    final compactMinFileSize = compactOnLaunch?.minFileSize;
    final compactMinBytes = compactOnLaunch?.minBytes;
    final compactMinRatio =
        compactOnLaunch == null ? double.nan : compactOnLaunch.minRatio;

    nCall(
      IC.isar_instance_create(
        _isarPtrPtr,
        namePtr,
        dirPtr,
        schemaStrPtr,
        maxSizeMiB,
        relaxedDurability,
        compactMinFileSize ?? 0,
        compactMinBytes ?? 0,
        compactMinRatio ?? 0,
      ),
    );

    final isar = IsarImpl(name, _isarPtrPtr.value);
    _initializeInstance(isar, schemas);
    return isar;
  });
}
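`_getOffsets` appends the collection's static size as the final entry of each offsets list, which is why `IsarWriterImpl` can take `offsets.last` as its `staticSize`, write it as a little-endian uint16 at offset 0, and zero the rest of the static section. A small Python sketch of just that header convention (illustrative only; the real buffer extends past the static section for dynamic data):

```python
def init_object_buffer(static_size: int) -> bytearray:
    """Mimic the IsarWriterImpl constructor: uint16 static size at offset 0,
    remaining static bytes zeroed so uninitialized memory is never persisted."""
    buf = bytearray(static_size)
    buf[0:2] = static_size.to_bytes(2, "little")
    return buf


buf = init_object_buffer(300)
print(int.from_bytes(buf[0:2], "little"))  # 300
print(all(b == 0 for b in buf[2:]))        # True
```

Readers use the same `staticSize` as a bounds check: any property offset at or beyond it is treated as absent and read as null.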
lib/src/native/query_build.dart (new file, 1040 lines)
File diff suppressed because it is too large.
lib/src/native/query_impl.dart (new file, 261 lines)
@@ -0,0 +1,261 @@
// ignore_for_file: public_member_api_docs

import 'dart:async';
import 'dart:ffi';
import 'dart:isolate';
import 'dart:typed_data';

import 'package:isar/isar.dart';
import 'package:isar/src/native/bindings.dart';
import 'package:isar/src/native/encode_string.dart';
import 'package:isar/src/native/isar_collection_impl.dart';
import 'package:isar/src/native/isar_core.dart';
import 'package:isar/src/native/txn.dart';

typedef QueryDeserialize<T> = List<T> Function(CObjectSet);

class QueryImpl<T> extends Query<T> implements Finalizable {
  QueryImpl(this.col, this.queryPtr, this.deserialize, this.propertyId) {
    NativeFinalizer(isarQueryFree).attach(this, queryPtr.cast());
  }

  static const int maxLimit = 4294967295;

  final IsarCollectionImpl<dynamic> col;
  final Pointer<CQuery> queryPtr;
  final QueryDeserialize<T> deserialize;
  final int? propertyId;

  @override
  Isar get isar => col.isar;

  @override
  Future<T?> findFirst() {
    return findInternal(maxLimit).then((List<T> result) {
      if (result.isNotEmpty) {
        return result[0];
      } else {
        return null;
      }
    });
  }

  @override
  Future<List<T>> findAll() => findInternal(maxLimit);

  Future<List<T>> findInternal(int limit) {
    return col.isar.getTxn(false, (Txn txn) async {
      final resultsPtr = txn.alloc<CObjectSet>();
      try {
        IC.isar_q_find(queryPtr, txn.ptr, resultsPtr, limit);
        await txn.wait();
        return deserialize(resultsPtr.ref).cast();
      } finally {
        IC.isar_free_c_object_set(resultsPtr);
      }
    });
  }

  @override
  T? findFirstSync() {
    final results = findSyncInternal(1);
    if (results.isNotEmpty) {
      return results[0];
    } else {
      return null;
    }
  }

  @override
  List<T> findAllSync() => findSyncInternal(maxLimit);

  List<T> findSyncInternal(int limit) {
    return col.isar.getTxnSync(false, (Txn txn) {
      final resultsPtr = txn.getCObjectsSet();
      try {
        nCall(IC.isar_q_find(queryPtr, txn.ptr, resultsPtr, limit));
        return deserialize(resultsPtr.ref).cast();
      } finally {
        IC.isar_free_c_object_set(resultsPtr);
      }
    });
  }

  @override
  Future<bool> deleteFirst() =>
      deleteInternal(1).then((int count) => count == 1);

  @override
  Future<int> deleteAll() => deleteInternal(maxLimit);

  Future<int> deleteInternal(int limit) {
    return col.isar.getTxn(false, (Txn txn) async {
      final countPtr = txn.alloc<Uint32>();
      IC.isar_q_delete(queryPtr, col.ptr, txn.ptr, limit, countPtr);
      await txn.wait();
      return countPtr.value;
    });
  }

  @override
  bool deleteFirstSync() => deleteSyncInternal(1) == 1;

  @override
  int deleteAllSync() => deleteSyncInternal(maxLimit);

  int deleteSyncInternal(int limit) {
    return col.isar.getTxnSync(false, (Txn txn) {
      final countPtr = txn.alloc<Uint32>();
      nCall(IC.isar_q_delete(queryPtr, col.ptr, txn.ptr, limit, countPtr));
      return countPtr.value;
    });
  }

  @override
  Stream<List<T>> watch({bool fireImmediately = false}) {
    return watchLazy(fireImmediately: fireImmediately)
        .asyncMap((event) => findAll());
  }

  @override
  Stream<void> watchLazy({bool fireImmediately = false}) {
    final port = ReceivePort();
    final handle = IC.isar_watch_query(
      col.isar.ptr,
      col.ptr,
      queryPtr,
      port.sendPort.nativePort,
    );

    final controller = StreamController<void>(
      onCancel: () {
        IC.isar_stop_watching(handle);
        port.close();
      },
    );

    if (fireImmediately) {
      controller.add(null);
    }

    controller.addStream(port);
    return controller.stream;
  }

  @override
  Future<R> exportJsonRaw<R>(R Function(Uint8List) callback) {
    return col.isar.getTxn(false, (Txn txn) async {
      final bytesPtrPtr = txn.alloc<Pointer<Uint8>>();
      final lengthPtr = txn.alloc<Uint32>();
      final idNamePtr = col.schema.idName.toCString(txn.alloc);

      nCall(
        IC.isar_q_export_json(
          queryPtr,
          col.ptr,
          txn.ptr,
          idNamePtr,
          bytesPtrPtr,
          lengthPtr,
        ),
      );

      try {
        await txn.wait();
        final bytes = bytesPtrPtr.value.asTypedList(lengthPtr.value);
        return callback(bytes);
      } finally {
        IC.isar_free_json(bytesPtrPtr.value, lengthPtr.value);
      }
    });
  }

  @override
  R exportJsonRawSync<R>(R Function(Uint8List) callback) {
    return col.isar.getTxnSync(false, (Txn txn) {
      final bytesPtrPtr = txn.alloc<Pointer<Uint8>>();
      final lengthPtr = txn.alloc<Uint32>();
      final idNamePtr = col.schema.idName.toCString(txn.alloc);

      try {
        nCall(
          IC.isar_q_export_json(
            queryPtr,
            col.ptr,
            txn.ptr,
            idNamePtr,
            bytesPtrPtr,
            lengthPtr,
          ),
        );
        final bytes = bytesPtrPtr.value.asTypedList(lengthPtr.value);
        return callback(bytes);
      } finally {
        IC.isar_free_json(bytesPtrPtr.value, lengthPtr.value);
      }
    });
  }

  @override
  Future<R?> aggregate<R>(AggregationOp op) async {
    return col.isar.getTxn(false, (Txn txn) async {
      final resultPtrPtr = txn.alloc<Pointer<CAggregationResult>>();

      IC.isar_q_aggregate(
        col.ptr,
        queryPtr,
        txn.ptr,
        op.index,
        propertyId ?? 0,
        resultPtrPtr,
      );
      await txn.wait();

      return _convertAggregatedResult<R>(resultPtrPtr.value, op);
    });
  }

  @override
  R? aggregateSync<R>(AggregationOp op) {
    return col.isar.getTxnSync(false, (Txn txn) {
      final resultPtrPtr = txn.alloc<Pointer<CAggregationResult>>();

      nCall(
        IC.isar_q_aggregate(
          col.ptr,
          queryPtr,
          txn.ptr,
          op.index,
          propertyId ?? 0,
          resultPtrPtr,
        ),
      );
      return _convertAggregatedResult(resultPtrPtr.value, op);
    });
  }

  R? _convertAggregatedResult<R>(
    Pointer<CAggregationResult> resultPtr,
    AggregationOp op,
  ) {
    final nullable = op == AggregationOp.min || op == AggregationOp.max;
    if (R == int || R == DateTime) {
      final value = IC.isar_q_aggregate_long_result(resultPtr);
|
||||||
|
if (nullable && value == nullLong) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
if (R == int) {
|
||||||
|
return value as R;
|
||||||
|
} else {
|
||||||
|
return DateTime.fromMicrosecondsSinceEpoch(value, isUtc: true).toLocal()
|
||||||
|
as R;
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
final value = IC.isar_q_aggregate_double_result(resultPtr);
|
||||||
|
if (nullable && value.isNaN) {
|
||||||
|
return null;
|
||||||
|
} else {
|
||||||
|
return value as R;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
33
lib/src/native/split_words.dart
Normal file
@@ -0,0 +1,33 @@
import 'dart:ffi';

import 'package:ffi/ffi.dart';
import 'package:isar/src/native/encode_string.dart';
import 'package:isar/src/native/isar_core.dart';
import 'package:isar/src/native/isar_reader_impl.dart';

// ignore: public_member_api_docs
List<String> isarSplitWords(String input) {
  initializeCoreBinary();

  final bytesPtr = malloc<Uint8>(input.length * 3);
  final bytes = bytesPtr.asTypedList(input.length * 3);
  final byteCount = encodeString(input, bytes, 0);

  final wordCountPtr = malloc<Uint32>();
  final boundariesPtr =
      IC.isar_find_word_boundaries(bytesPtr.cast(), byteCount, wordCountPtr);
  final wordCount = wordCountPtr.value;
  final boundaries = boundariesPtr.asTypedList(wordCount * 2);

  final words = <String>[];
  for (var i = 0; i < wordCount * 2; i++) {
    final wordBytes = bytes.sublist(boundaries[i++], boundaries[i]);
    words.add(IsarReaderImpl.utf8Decoder.convert(wordBytes));
  }

  IC.isar_free_word_boundaries(boundariesPtr, wordCount);
  malloc.free(bytesPtr);
  malloc.free(wordCountPtr);

  return words;
}
113
lib/src/native/txn.dart
Normal file
@@ -0,0 +1,113 @@
import 'dart:async';
import 'dart:collection';
import 'dart:ffi';

import 'package:ffi/ffi.dart';
import 'package:isar/isar.dart';
import 'package:isar/src/common/isar_common.dart';
import 'package:isar/src/native/bindings.dart';
import 'package:isar/src/native/isar_core.dart';

/// @nodoc
class Txn extends Transaction {
  /// @nodoc
  Txn.sync(Isar isar, this.ptr, bool write) : super(isar, true, write);

  /// @nodoc
  Txn.async(Isar isar, this.ptr, bool write, Stream<void> stream)
      : super(isar, false, write) {
    _completers = Queue();
    _portSubscription = stream.listen(
      (_) => _completers.removeFirst().complete(),
      onError: (Object e) => _completers.removeFirst().completeError(e),
    );
  }

  @override
  bool active = true;

  /// An arena allocator that has the same lifetime as this transaction.
  final alloc = Arena(malloc);

  /// The pointer to the native transaction.
  final Pointer<CIsarTxn> ptr;
  Pointer<CObject>? _cObjPtr;
  Pointer<CObjectSet>? _cObjSetPtr;

  late Pointer<Uint8> _buffer;
  int _bufferLen = -1;

  late final Queue<Completer<void>> _completers;
  late final StreamSubscription<void>? _portSubscription;

  /// Get a shared CObject pointer
  Pointer<CObject> getCObject() {
    _cObjPtr ??= alloc<CObject>();
    return _cObjPtr!;
  }

  /// Get a shared CObjectSet pointer
  Pointer<CObjectSet> getCObjectsSet() {
    _cObjSetPtr ??= alloc();
    return _cObjSetPtr!;
  }

  /// Allocate a new CObjectSet with the given capacity.
  Pointer<CObjectSet> newCObjectSet(int length) {
    final cObjSetPtr = alloc<CObjectSet>();
    cObjSetPtr.ref
      ..objects = alloc<CObject>(length)
      ..length = length;
    return cObjSetPtr;
  }

  /// Get a shared buffer with at least the specified size.
  Pointer<Uint8> getBuffer(int size) {
    if (_bufferLen < size) {
      final allocSize = (size * 1.3).toInt();
      _buffer = alloc(allocSize);
      _bufferLen = allocSize;
    }
    return _buffer;
  }

  /// Wait for the latest async operation to complete.
  Future<void> wait() {
    final completer = Completer<void>();
    _completers.add(completer);
    return completer.future;
  }

  @override
  Future<void> commit() async {
    active = false;
    IC.isar_txn_finish(ptr, true);
    await wait();
    unawaited(_portSubscription!.cancel());
  }

  @override
  void commitSync() {
    active = false;
    nCall(IC.isar_txn_finish(ptr, true));
  }

  @override
  Future<void> abort() async {
    active = false;
    IC.isar_txn_finish(ptr, false);
    await wait();
    unawaited(_portSubscription!.cancel());
  }

  @override
  void abortSync() {
    active = false;
    nCall(IC.isar_txn_finish(ptr, false));
  }

  @override
  void free() {
    alloc.releaseAll();
  }
}
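`getBuffer` only reallocates the shared scratch buffer when a request exceeds the current capacity, and then overshoots by 30% so a run of slightly growing requests amortizes to few allocations. A standalone sketch of just that policy (names are illustrative, not from this file):

```dart
// Illustration of the Txn.getBuffer growth policy in isolation:
// reallocate only when the request exceeds capacity, overshooting by 30%.
int capacity = -1;

int ensure(int size) {
  if (capacity < size) {
    capacity = (size * 1.3).toInt();
  }
  return capacity;
}

// ensure(100) -> 130; ensure(120) -> 130 (reuses buffer); ensure(200) -> 260
```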
204
lib/src/query.dart
Normal file
@@ -0,0 +1,204 @@
part of isar;

/// Querying is how you find records that match certain conditions.
abstract class Query<T> {
  /// The default precision for floating point number queries.
  static const double epsilon = 0.00001;

  /// The corresponding Isar instance.
  Isar get isar;

  /// {@template query_find_first}
  /// Find the first object that matches this query or `null` if no object
  /// matches.
  /// {@endtemplate}
  Future<T?> findFirst();

  /// {@macro query_find_first}
  T? findFirstSync();

  /// {@template query_find_all}
  /// Find all objects that match this query.
  /// {@endtemplate}
  Future<List<T>> findAll();

  /// {@macro query_find_all}
  List<T> findAllSync();

  /// @nodoc
  @protected
  Future<R?> aggregate<R>(AggregationOp op);

  /// @nodoc
  @protected
  R? aggregateSync<R>(AggregationOp op);

  /// {@template query_count}
  /// Count how many objects match this query.
  ///
  /// This operation is much faster than using `findAll().length`.
  /// {@endtemplate}
  Future<int> count() =>
      aggregate<int>(AggregationOp.count).then((int? value) => value!);

  /// {@macro query_count}
  int countSync() => aggregateSync<int>(AggregationOp.count)!;

  /// {@template query_is_empty}
  /// Returns `true` if there are no objects that match this query.
  ///
  /// This operation is faster than using `count() == 0`.
  /// {@endtemplate}
  Future<bool> isEmpty() =>
      aggregate<int>(AggregationOp.isEmpty).then((value) => value == 1);

  /// {@macro query_is_empty}
  bool isEmptySync() => aggregateSync<int>(AggregationOp.isEmpty) == 1;

  /// {@template query_is_not_empty}
  /// Returns `true` if there are objects that match this query.
  ///
  /// This operation is faster than using `count() > 0`.
  /// {@endtemplate}
  Future<bool> isNotEmpty() =>
      aggregate<int>(AggregationOp.isEmpty).then((value) => value == 0);

  /// {@macro query_is_not_empty}
  bool isNotEmptySync() => aggregateSync<int>(AggregationOp.isEmpty) == 0;

  /// {@template query_delete_first}
  /// Delete the first object that matches this query. Returns whether an
  /// object has been deleted.
  /// {@endtemplate}
  Future<bool> deleteFirst();

  /// {@macro query_delete_first}
  bool deleteFirstSync();

  /// {@template query_delete_all}
  /// Delete all objects that match this query. Returns the number of deleted
  /// objects.
  /// {@endtemplate}
  Future<int> deleteAll();

  /// {@macro query_delete_all}
  int deleteAllSync();

  /// {@template query_watch}
  /// Create a watcher that yields the results of this query whenever its
  /// results have (potentially) changed.
  ///
  /// If you don't always use the results, consider using `watchLazy` and rerun
  /// the query manually. If [fireImmediately] is `true`, the results will be
  /// sent to the consumer immediately.
  /// {@endtemplate}
  Stream<List<T>> watch({bool fireImmediately = false});

  /// {@template query_watch_lazy}
  /// Watch the query for changes. If [fireImmediately] is `true`, an event will
  /// be fired immediately.
  /// {@endtemplate}
  Stream<void> watchLazy({bool fireImmediately = false});

  /// {@template query_export_json_raw}
  /// Export the results of this query as json bytes.
  ///
  /// **IMPORTANT:** Do not leak the bytes outside the callback. If you need to
  /// use the bytes outside, create a copy of the `Uint8List`.
  /// {@endtemplate}
  Future<R> exportJsonRaw<R>(R Function(Uint8List) callback);

  /// {@macro query_export_json_raw}
  R exportJsonRawSync<R>(R Function(Uint8List) callback);

  /// {@template query_export_json}
  /// Export the results of this query as json.
  /// {@endtemplate}
  Future<List<Map<String, dynamic>>> exportJson() {
    return exportJsonRaw((Uint8List bytes) {
      final json = jsonDecode(const Utf8Decoder().convert(bytes)) as List;
      return json.cast<Map<String, dynamic>>();
    });
  }

  /// {@macro query_export_json}
  List<Map<String, dynamic>> exportJsonSync() {
    return exportJsonRawSync((Uint8List bytes) {
      final json = jsonDecode(const Utf8Decoder().convert(bytes)) as List;
      return json.cast<Map<String, dynamic>>();
    });
  }
}

/// @nodoc
@protected
enum AggregationOp {
  /// Finds the smallest value.
  min,

  /// Finds the largest value.
  max,

  /// Calculates the sum of all values.
  sum,

  /// Calculates the average of all values.
  average,

  /// Counts all values.
  count,

  /// Returns `true` if the query has no results.
  isEmpty,
}

/// Extension for Queries
extension QueryAggregation<T extends num> on Query<T?> {
  /// {@template aggregation_min}
  /// Returns the minimum value of this query.
  /// {@endtemplate}
  Future<T?> min() => aggregate<T>(AggregationOp.min);

  /// {@macro aggregation_min}
  T? minSync() => aggregateSync<T>(AggregationOp.min);

  /// {@template aggregation_max}
  /// Returns the maximum value of this query.
  /// {@endtemplate}
  Future<T?> max() => aggregate<T>(AggregationOp.max);

  /// {@macro aggregation_max}
  T? maxSync() => aggregateSync<T>(AggregationOp.max);

  /// {@template aggregation_average}
  /// Returns the average value of this query.
  /// {@endtemplate}
  Future<double> average() =>
      aggregate<double>(AggregationOp.average).then((double? value) => value!);

  /// {@macro aggregation_average}
  double averageSync() => aggregateSync<double>(AggregationOp.average)!;

  /// {@template aggregation_sum}
  /// Returns the sum of all values of this query.
  /// {@endtemplate}
  Future<T> sum() => aggregate<T>(AggregationOp.sum).then((value) => value!);

  /// {@macro aggregation_sum}
  T sumSync() => aggregateSync<T>(AggregationOp.sum)!;
}

/// Extension for Queries
extension QueryDateAggregation<T extends DateTime?> on Query<T> {
  /// {@macro aggregation_min}
  Future<DateTime?> min() => aggregate<DateTime>(AggregationOp.min);

  /// {@macro aggregation_min}
  DateTime? minSync() => aggregateSync<DateTime>(AggregationOp.min);

  /// {@macro aggregation_max}
  Future<DateTime?> max() => aggregate<DateTime>(AggregationOp.max);

  /// {@macro aggregation_max}
  DateTime? maxSync() => aggregateSync<DateTime>(AggregationOp.max);
}
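The `Query` surface in `query.dart` is typically consumed through a generated collection; a purely illustrative sketch of the count/emptiness/export methods, where `Contact`, `isar.contacts`, and `nameStartsWith` are assumed generator output rather than anything defined in this commit:

```dart
// Hypothetical usage of the Query API above. The collection accessor and
// the nameStartsWith condition come from Isar's code generator (assumed),
// not from the files in this commit.
Future<void> printContactStats(Isar isar) async {
  final query = isar.contacts.filter().nameStartsWith('A').build();

  final total = await query.count();      // aggregate, faster than findAll().length
  final any = await query.isNotEmpty();   // aggregate, faster than count() > 0
  final rows = await query.exportJson();  // List<Map<String, dynamic>>

  print('$total matches (any: $any), first row: '
      '${rows.isEmpty ? null : rows.first}');
}
```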
403
lib/src/query_builder.dart
Normal file
@@ -0,0 +1,403 @@
part of isar;

/// @nodoc
@protected
typedef FilterQuery<OBJ> = QueryBuilder<OBJ, OBJ, QAfterFilterCondition>
    Function(QueryBuilder<OBJ, OBJ, QFilterCondition> q);

/// Query builders are used to create queries in a safe way.
///
/// Acquire a `QueryBuilder` instance using `collection.where()` or
/// `collection.filter()`.
class QueryBuilder<OBJ, R, S> {
  /// @nodoc
  @protected
  const QueryBuilder(this._query);

  final QueryBuilderInternal<OBJ> _query;

  /// @nodoc
  @protected
  static QueryBuilder<OBJ, R, S> apply<OBJ, R, S>(
    QueryBuilder<OBJ, dynamic, dynamic> qb,
    QueryBuilderInternal<OBJ> Function(QueryBuilderInternal<OBJ> query)
        transform,
  ) {
    return QueryBuilder(transform(qb._query));
  }
}

/// @nodoc
@protected
class QueryBuilderInternal<OBJ> {
  /// @nodoc
  const QueryBuilderInternal({
    this.collection,
    this.whereClauses = const [],
    this.whereDistinct = false,
    this.whereSort = Sort.asc,
    this.filter = const FilterGroup.and([]),
    this.filterGroupType = FilterGroupType.and,
    this.filterNot = false,
    this.distinctByProperties = const [],
    this.sortByProperties = const [],
    this.offset,
    this.limit,
    this.propertyName,
  });

  /// @nodoc
  final IsarCollection<OBJ>? collection;

  /// @nodoc
  final List<WhereClause> whereClauses;

  /// @nodoc
  final bool whereDistinct;

  /// @nodoc
  final Sort whereSort;

  /// @nodoc
  final FilterGroup filter;

  /// @nodoc
  final FilterGroupType filterGroupType;

  /// @nodoc
  final bool filterNot;

  /// @nodoc
  final List<DistinctProperty> distinctByProperties;

  /// @nodoc
  final List<SortProperty> sortByProperties;

  /// @nodoc
  final int? offset;

  /// @nodoc
  final int? limit;

  /// @nodoc
  final String? propertyName;

  /// @nodoc
  QueryBuilderInternal<OBJ> addFilterCondition(FilterOperation cond) {
    if (filterNot) {
      cond = FilterGroup.not(cond);
    }

    late FilterGroup filterGroup;

    if (filter.type == filterGroupType || filter.filters.length <= 1) {
      filterGroup = FilterGroup(
        type: filterGroupType,
        filters: [...filter.filters, cond],
      );
    } else if (filterGroupType == FilterGroupType.and) {
      filterGroup = FilterGroup(
        type: filter.type,
        filters: [
          ...filter.filters.sublist(0, filter.filters.length - 1),
          FilterGroup(
            type: filterGroupType,
            filters: [filter.filters.last, cond],
          ),
        ],
      );
    } else {
      filterGroup = FilterGroup(
        type: filterGroupType,
        filters: [filter, cond],
      );
    }

    return copyWith(
      filter: filterGroup,
      filterGroupType: FilterGroupType.and,
      filterNot: false,
    );
  }

  /// @nodoc
  QueryBuilderInternal<OBJ> addWhereClause(WhereClause where) {
    return copyWith(whereClauses: [...whereClauses, where]);
  }

  /// @nodoc
  QueryBuilderInternal<OBJ> group(FilterQuery<OBJ> q) {
    // ignore: prefer_const_constructors
    final qb = q(QueryBuilder(QueryBuilderInternal()));
    return addFilterCondition(qb._query.filter);
  }

  /// @nodoc
  QueryBuilderInternal<OBJ> listLength<E>(
    String property,
    int lower,
    bool includeLower,
    int upper,
    bool includeUpper,
  ) {
    if (!includeLower) {
      lower += 1;
    }
    if (!includeUpper) {
      if (upper == 0) {
        lower = 1;
      } else {
        upper -= 1;
      }
    }
    return addFilterCondition(
      FilterCondition.listLength(
        property: property,
        lower: lower,
        upper: upper,
      ),
    );
  }

  /// @nodoc
  QueryBuilderInternal<OBJ> object<E>(
    FilterQuery<E> q,
    String property,
  ) {
    // ignore: prefer_const_constructors
    final qb = q(QueryBuilder(QueryBuilderInternal()));
    return addFilterCondition(
      ObjectFilter(filter: qb._query.filter, property: property),
    );
  }

  /// @nodoc
  QueryBuilderInternal<OBJ> link<E>(
    FilterQuery<E> q,
    String linkName,
  ) {
    // ignore: prefer_const_constructors
    final qb = q(QueryBuilder(QueryBuilderInternal()));
    return addFilterCondition(
      LinkFilter(filter: qb._query.filter, linkName: linkName),
    );
  }

  /// @nodoc
  QueryBuilderInternal<OBJ> linkLength<E>(
    String linkName,
    int lower,
    bool includeLower,
    int upper,
    bool includeUpper,
  ) {
    if (!includeLower) {
      lower += 1;
    }
    if (!includeUpper) {
      if (upper == 0) {
        lower = 1;
      } else {
        upper -= 1;
      }
    }
    return addFilterCondition(
      LinkFilter.length(
        lower: lower,
        upper: upper,
        linkName: linkName,
      ),
    );
  }

  /// @nodoc
  QueryBuilderInternal<OBJ> addSortBy(String propertyName, Sort sort) {
    return copyWith(
      sortByProperties: [
        ...sortByProperties,
        SortProperty(property: propertyName, sort: sort),
      ],
    );
  }

  /// @nodoc
  QueryBuilderInternal<OBJ> addDistinctBy(
    String propertyName, {
    bool? caseSensitive,
  }) {
    return copyWith(
      distinctByProperties: [
        ...distinctByProperties,
        DistinctProperty(
          property: propertyName,
          caseSensitive: caseSensitive,
        ),
      ],
    );
  }

  /// @nodoc
  QueryBuilderInternal<OBJ> addPropertyName<E>(String propertyName) {
    return copyWith(propertyName: propertyName);
  }

  /// @nodoc
  QueryBuilderInternal<OBJ> copyWith({
    List<WhereClause>? whereClauses,
    FilterGroup? filter,
    bool? filterIsGrouped,
    FilterGroupType? filterGroupType,
    bool? filterNot,
    List<FilterGroup>? parentFilters,
    List<DistinctProperty>? distinctByProperties,
    List<SortProperty>? sortByProperties,
    int? offset,
    int? limit,
    String? propertyName,
  }) {
    assert(offset == null || offset >= 0, 'Invalid offset');
    assert(limit == null || limit >= 0, 'Invalid limit');
    return QueryBuilderInternal(
      collection: collection,
      whereClauses: whereClauses ?? List.unmodifiable(this.whereClauses),
      whereDistinct: whereDistinct,
      whereSort: whereSort,
      filter: filter ?? this.filter,
      filterGroupType: filterGroupType ?? this.filterGroupType,
      filterNot: filterNot ?? this.filterNot,
      distinctByProperties:
          distinctByProperties ?? List.unmodifiable(this.distinctByProperties),
      sortByProperties:
          sortByProperties ?? List.unmodifiable(this.sortByProperties),
      offset: offset ?? this.offset,
      limit: limit ?? this.limit,
      propertyName: propertyName ?? this.propertyName,
    );
  }

  /// @nodoc
  @protected
  Query<R> build<R>() {
    return collection!.buildQuery(
      whereDistinct: whereDistinct,
      whereSort: whereSort,
      whereClauses: whereClauses,
      filter: filter,
      sortBy: sortByProperties,
      distinctBy: distinctByProperties,
      offset: offset,
      limit: limit,
      property: propertyName,
    );
  }
}

/// @nodoc
///
/// Right after query starts
@protected
class QWhere
    implements
        QWhereClause,
        QFilter,
        QSortBy,
        QDistinct,
        QOffset,
        QLimit,
        QQueryProperty {}

/// @nodoc
///
/// No more where conditions are allowed
@protected
class QAfterWhere
    implements QFilter, QSortBy, QDistinct, QOffset, QLimit, QQueryProperty {}

/// @nodoc
@protected
class QWhereClause {}

/// @nodoc
@protected
class QAfterWhereClause
    implements
        QWhereOr,
        QFilter,
        QSortBy,
        QDistinct,
        QOffset,
        QLimit,
        QQueryProperty {}

/// @nodoc
@protected
class QWhereOr {}

/// @nodoc
@protected
class QFilter {}

/// @nodoc
@protected
class QFilterCondition {}

/// @nodoc
@protected
class QAfterFilterCondition
    implements
        QFilterCondition,
        QFilterOperator,
        QSortBy,
        QDistinct,
        QOffset,
        QLimit,
        QQueryProperty {}

/// @nodoc
@protected
class QFilterOperator {}

/// @nodoc
@protected
class QAfterFilterOperator implements QFilterCondition {}

/// @nodoc
@protected
class QSortBy {}

/// @nodoc
@protected
class QAfterSortBy
    implements QSortThenBy, QDistinct, QOffset, QLimit, QQueryProperty {}

/// @nodoc
@protected
class QSortThenBy {}

/// @nodoc
@protected
class QDistinct implements QOffset, QLimit, QQueryProperty {}

/// @nodoc
@protected
class QOffset {}

/// @nodoc
@protected
class QAfterOffset implements QLimit, QQueryProperty {}

/// @nodoc
@protected
class QLimit {}

/// @nodoc
@protected
class QAfterLimit implements QQueryProperty {}

/// @nodoc
@protected
class QQueryProperty implements QQueryOperations {}

/// @nodoc
@protected
class QQueryOperations {}
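The `Q*` marker classes in `query_builder.dart` are phantom type states: each builder method returns a `QueryBuilder` whose third type parameter records which operations remain legal, so an invalid chain fails to compile rather than at runtime. A hypothetical chain (the collection accessor and generated condition/sort methods are assumed, not defined in this commit) moves through the states roughly like this:

```dart
// Each call narrows S in QueryBuilder<OBJ, R, S>; e.g. after filter() the
// builder no longer implements QWhereClause, so adding a where clause is a
// compile error. `isar.contacts` and the generated methods are assumptions.
final results = await isar.contacts
    .where()              // -> ..., QWhere: where clauses, filter, sort all open
    .filter()             // -> ..., QFilterCondition: where clauses now closed
    .nameEqualTo('Alice') // -> ..., QAfterFilterCondition: and/or/sort/limit open
    .sortByName()         // -> ..., QAfterSortBy: no new where clauses or filters
    .findAll();
```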
303
lib/src/query_builder_extensions.dart
Normal file
@@ -0,0 +1,303 @@
part of isar;

/// Extension for QueryBuilders.
extension QueryWhereOr<OBJ, R> on QueryBuilder<OBJ, R, QWhereOr> {
  /// Union of two where clauses.
  QueryBuilder<OBJ, R, QWhereClause> or() {
    return QueryBuilder(_query);
  }
}

/// @nodoc
@protected
typedef WhereRepeatModifier<OBJ, R, E> = QueryBuilder<OBJ, R, QAfterWhereClause>
    Function(QueryBuilder<OBJ, R, QWhereClause> q, E element);

/// Extension for QueryBuilders.
extension QueryWhere<OBJ, R> on QueryBuilder<OBJ, R, QWhereClause> {
  /// Joins the results of the [modifier] for each item in [items] using logical
  /// OR. So an object will be included if it matches at least one of the
  /// resulting where clauses.
  ///
  /// If [items] is empty, this is a no-op.
  QueryBuilder<OBJ, R, QAfterWhereClause> anyOf<E, RS>(
    Iterable<E> items,
    WhereRepeatModifier<OBJ, R, E> modifier,
  ) {
    QueryBuilder<OBJ, R, QAfterWhereClause>? q;
    for (final e in items) {
      q = modifier(q?.or() ?? QueryBuilder(_query), e);
    }
    return q ?? QueryBuilder(_query);
  }
}

/// Extension for QueryBuilders.
extension QueryFilters<OBJ, R> on QueryBuilder<OBJ, R, QFilter> {
  /// Start using filter conditions.
  QueryBuilder<OBJ, R, QFilterCondition> filter() {
    return QueryBuilder(_query);
  }
}

/// @nodoc
@protected
typedef FilterRepeatModifier<OBJ, R, E>
    = QueryBuilder<OBJ, R, QAfterFilterCondition> Function(
  QueryBuilder<OBJ, R, QFilterCondition> q,
  E element,
);

/// Extension for QueryBuilders.
extension QueryFilterAndOr<OBJ, R> on QueryBuilder<OBJ, R, QFilterOperator> {
  /// Intersection of two filter conditions.
  QueryBuilder<OBJ, R, QFilterCondition> and() {
    return QueryBuilder.apply(
      this,
      (q) => q.copyWith(filterGroupType: FilterGroupType.and),
    );
  }

  /// Union of two filter conditions.
  QueryBuilder<OBJ, R, QFilterCondition> or() {
    return QueryBuilder.apply(
      this,
      (q) => q.copyWith(filterGroupType: FilterGroupType.or),
    );
  }

  /// Logical XOR of two filter conditions.
  QueryBuilder<OBJ, R, QFilterCondition> xor() {
    return QueryBuilder.apply(
      this,
      (q) => q.copyWith(filterGroupType: FilterGroupType.xor),
    );
  }
}

/// Extension for QueryBuilders.
extension QueryFilterNot<OBJ, R> on QueryBuilder<OBJ, R, QFilterCondition> {
  /// Complement the next filter condition or group.
  QueryBuilder<OBJ, R, QFilterCondition> not() {
    return QueryBuilder.apply(
      this,
      (q) => q.copyWith(filterNot: !q.filterNot),
    );
  }

  /// Joins the results of the [modifier] for each item in [items] using logical
  /// OR. So an object will be included if it matches at least one of the
  /// resulting filters.
  ///
  /// If [items] is empty, this is a no-op.
  QueryBuilder<OBJ, R, QAfterFilterCondition> anyOf<E, RS>(
    Iterable<E> items,
    FilterRepeatModifier<OBJ, OBJ, E> modifier,
  ) {
    return QueryBuilder.apply(this, (query) {
      return query.group((q) {
        var q2 = QueryBuilder<OBJ, OBJ, QAfterFilterCondition>(q._query);
        for (final e in items) {
          q2 = modifier(q2.or(), e);
        }
        return q2;
      });
    });
  }

  /// Joins the results of the [modifier] for each item in [items] using logical
  /// AND. So an object will be included if it matches all of the resulting
  /// filters.
  ///
  /// If [items] is empty, this is a no-op.
  QueryBuilder<OBJ, R, QAfterFilterCondition> allOf<E, RS>(
    Iterable<E> items,
    FilterRepeatModifier<OBJ, OBJ, E> modifier,
  ) {
    return QueryBuilder.apply(this, (query) {
      return query.group((q) {
        var q2 = QueryBuilder<OBJ, OBJ, QAfterFilterCondition>(q._query);
        for (final e in items) {
          q2 = modifier(q2.and(), e);
        }
        return q2;
      });
    });
  }

  /// Joins the results of the [modifier] for each item in [items] using logical
  /// XOR. So an object will be included if it matches exactly one of the
  /// resulting filters.
  ///
  /// If [items] is empty, this is a no-op.
  QueryBuilder<OBJ, R, QAfterFilterCondition> oneOf<E, RS>(
    Iterable<E> items,
    FilterRepeatModifier<OBJ, R, E> modifier,
  ) {
    QueryBuilder<OBJ, R, QAfterFilterCondition>? q;
    for (final e in items) {
      q = modifier(q?.xor() ?? QueryBuilder(_query), e);
    }
    return q ?? QueryBuilder(_query);
  }
}
|
||||||
|
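
// Usage sketch (illustration only, not part of the library source): `anyOf`
// expands a list of values into a single OR group. The `Contact` collection
// and its generated `nameEqualTo` filter are hypothetical names:
//
//   Future<List<Contact>> findByNames(Isar isar, List<String> names) {
//     return isar.contacts
//         .filter()
//         .anyOf(names, (q, String name) => q.nameEqualTo(name))
//         .findAll();
//   }
//
// With an empty `names` list, the group is a no-op, as documented above.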

/// Extension for QueryBuilders.
extension QueryFilterNoGroups<OBJ, R>
    on QueryBuilder<OBJ, R, QFilterCondition> {
  /// Group filter conditions.
  QueryBuilder<OBJ, R, QAfterFilterCondition> group(FilterQuery<OBJ> q) {
    return QueryBuilder.apply(this, (query) => query.group(q));
  }
}

/// Extension for QueryBuilders.
extension QueryOffset<OBJ, R> on QueryBuilder<OBJ, R, QOffset> {
  /// Offset the query results by a static number.
  QueryBuilder<OBJ, R, QAfterOffset> offset(int offset) {
    return QueryBuilder.apply(this, (q) => q.copyWith(offset: offset));
  }
}

/// Extension for QueryBuilders.
extension QueryLimit<OBJ, R> on QueryBuilder<OBJ, R, QLimit> {
  /// Limit the maximum number of query results.
  QueryBuilder<OBJ, R, QAfterLimit> limit(int limit) {
    return QueryBuilder.apply(this, (q) => q.copyWith(limit: limit));
  }
}

/// @nodoc
@protected
typedef QueryOption<OBJ, S, RS> = QueryBuilder<OBJ, OBJ, RS> Function(
  QueryBuilder<OBJ, OBJ, S> q,
);

/// Extension for QueryBuilders.
extension QueryModifier<OBJ, S> on QueryBuilder<OBJ, OBJ, S> {
  /// Only apply a part of the query if `enabled` is true.
  QueryBuilder<OBJ, OBJ, RS> optional<RS>(
    bool enabled,
    QueryOption<OBJ, S, RS> option,
  ) {
    if (enabled) {
      return option(this);
    } else {
      return QueryBuilder(_query);
    }
  }
}
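
// Usage sketch (illustration only, not part of the library source):
// `optional` conditionally applies part of a query chain. `Contact` and its
// generated `ageGreaterThan` filter are hypothetical names:
//
//   Future<List<Contact>> findContacts(Isar isar, {int? minAge}) {
//     return isar.contacts
//         .filter()
//         .optional(minAge != null, (q) => q.ageGreaterThan(minAge!))
//         .findAll();
//   }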

/// Extension for QueryBuilders.
extension QueryExecute<OBJ, R> on QueryBuilder<OBJ, R, QQueryOperations> {
  /// Create a query from this query builder.
  Query<R> build() => _query.build();

  /// {@macro query_find_first}
  Future<R?> findFirst() => build().findFirst();

  /// {@macro query_find_first}
  R? findFirstSync() => build().findFirstSync();

  /// {@macro query_find_all}
  Future<List<R>> findAll() => build().findAll();

  /// {@macro query_find_all}
  List<R> findAllSync() => build().findAllSync();

  /// {@macro query_count}
  Future<int> count() => build().count();

  /// {@macro query_count}
  int countSync() => build().countSync();

  /// {@macro query_is_empty}
  Future<bool> isEmpty() => build().isEmpty();

  /// {@macro query_is_empty}
  bool isEmptySync() => build().isEmptySync();

  /// {@macro query_is_not_empty}
  Future<bool> isNotEmpty() => build().isNotEmpty();

  /// {@macro query_is_not_empty}
  bool isNotEmptySync() => build().isNotEmptySync();

  /// {@macro query_delete_first}
  Future<bool> deleteFirst() => build().deleteFirst();

  /// {@macro query_delete_first}
  bool deleteFirstSync() => build().deleteFirstSync();

  /// {@macro query_delete_all}
  Future<int> deleteAll() => build().deleteAll();

  /// {@macro query_delete_all}
  int deleteAllSync() => build().deleteAllSync();

  /// {@macro query_watch}
  Stream<List<R>> watch({bool fireImmediately = false}) =>
      build().watch(fireImmediately: fireImmediately);

  /// {@macro query_watch_lazy}
  Stream<void> watchLazy({bool fireImmediately = false}) =>
      build().watchLazy(fireImmediately: fireImmediately);

  /// {@macro query_export_json_raw}
  Future<T> exportJsonRaw<T>(T Function(Uint8List) callback) =>
      build().exportJsonRaw(callback);

  /// {@macro query_export_json_raw}
  T exportJsonRawSync<T>(T Function(Uint8List) callback) =>
      build().exportJsonRawSync(callback);

  /// {@macro query_export_json}
  Future<List<Map<String, dynamic>>> exportJson() => build().exportJson();

  /// {@macro query_export_json}
  List<Map<String, dynamic>> exportJsonSync() => build().exportJsonSync();
}

/// Extension for QueryBuilders.
extension QueryExecuteAggregation<OBJ, T extends num>
    on QueryBuilder<OBJ, T?, QQueryOperations> {
  /// {@macro aggregation_min}
  Future<T?> min() => build().min();

  /// {@macro aggregation_min}
  T? minSync() => build().minSync();

  /// {@macro aggregation_max}
  Future<T?> max() => build().max();

  /// {@macro aggregation_max}
  T? maxSync() => build().maxSync();

  /// {@macro aggregation_average}
  Future<double> average() => build().average();

  /// {@macro aggregation_average}
  double averageSync() => build().averageSync();

  /// {@macro aggregation_sum}
  Future<T> sum() => build().sum();

  /// {@macro aggregation_sum}
  T sumSync() => build().sumSync();
}

/// Extension for QueryBuilders.
extension QueryExecuteDateAggregation<OBJ>
    on QueryBuilder<OBJ, DateTime?, QQueryOperations> {
  /// {@macro aggregation_min}
  Future<DateTime?> min() => build().min();

  /// {@macro aggregation_min}
  DateTime? minSync() => build().minSync();

  /// {@macro aggregation_max}
  Future<DateTime?> max() => build().max();

  /// {@macro aggregation_max}
  DateTime? maxSync() => build().maxSync();
}
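
// Usage sketch (illustration only, not part of the library source): the
// aggregation extensions operate on property queries. `Contact` and its
// generated `ageProperty` accessor are hypothetical names:
//
//   Future<int?> oldestAge(Isar isar) {
//     return isar.contacts.where().ageProperty().max();
//   }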

lib/src/query_components.dart (new file, 597 lines)

part of isar;

/// A where clause to traverse an Isar index.
abstract class WhereClause {
  const WhereClause._();
}

/// A where clause traversing the primary index (ids).
class IdWhereClause extends WhereClause {
  /// Where clause that matches all ids. Useful to get sorted results.
  const IdWhereClause.any()
      : lower = null,
        upper = null,
        includeLower = true,
        includeUpper = true,
        super._();

  /// Where clause that matches all id values greater than the given [lower]
  /// bound.
  const IdWhereClause.greaterThan({
    required Id this.lower,
    this.includeLower = true,
  })  : upper = null,
        includeUpper = true,
        super._();

  /// Where clause that matches all id values less than the given [upper]
  /// bound.
  const IdWhereClause.lessThan({
    required Id this.upper,
    this.includeUpper = true,
  })  : lower = null,
        includeLower = true,
        super._();

  /// Where clause that matches the id value equal to the given [value].
  const IdWhereClause.equalTo({
    required Id value,
  })  : lower = value,
        upper = value,
        includeLower = true,
        includeUpper = true,
        super._();

  /// Where clause that matches all id values between the given [lower] and
  /// [upper] bounds.
  const IdWhereClause.between({
    this.lower,
    this.includeLower = true,
    this.upper,
    this.includeUpper = true,
  }) : super._();

  /// The lower bound id or `null` for unbounded.
  final Id? lower;

  /// Whether the lower bound should be included in the results.
  final bool includeLower;

  /// The upper bound id or `null` for unbounded.
  final Id? upper;

  /// Whether the upper bound should be included in the results.
  final bool includeUpper;
}

/// A where clause traversing an index.
class IndexWhereClause extends WhereClause {
  /// Where clause that matches all index values. Useful to get sorted results.
  const IndexWhereClause.any({required this.indexName})
      : lower = null,
        upper = null,
        includeLower = true,
        includeUpper = true,
        epsilon = Query.epsilon,
        super._();

  /// Where clause that matches all index values greater than the given [lower]
  /// bound.
  ///
  /// For composite indexes, the first elements of the [lower] list are checked
  /// for equality.
  const IndexWhereClause.greaterThan({
    required this.indexName,
    required IndexKey this.lower,
    this.includeLower = true,
    this.epsilon = Query.epsilon,
  })  : upper = null,
        includeUpper = true,
        super._();

  /// Where clause that matches all index values less than the given [upper]
  /// bound.
  ///
  /// For composite indexes, the first elements of the [upper] list are checked
  /// for equality.
  const IndexWhereClause.lessThan({
    required this.indexName,
    required IndexKey this.upper,
    this.includeUpper = true,
    this.epsilon = Query.epsilon,
  })  : lower = null,
        includeLower = true,
        super._();

  /// Where clause that matches all index values equal to the given [value].
  const IndexWhereClause.equalTo({
    required this.indexName,
    required IndexKey value,
    this.epsilon = Query.epsilon,
  })  : lower = value,
        upper = value,
        includeLower = true,
        includeUpper = true,
        super._();

  /// Where clause that matches all index values between the given [lower] and
  /// [upper] bounds.
  ///
  /// For composite indexes, the first elements of the [lower] and [upper] lists
  /// are checked for equality.
  const IndexWhereClause.between({
    required this.indexName,
    required IndexKey this.lower,
    this.includeLower = true,
    required IndexKey this.upper,
    this.includeUpper = true,
    this.epsilon = Query.epsilon,
  }) : super._();

  /// The Isar name of the index to be used.
  final String indexName;

  /// The lower bound of the where clause.
  final IndexKey? lower;

  /// Whether the lower bound should be included in the results. Double values
  /// are never included.
  final bool includeLower;

  /// The upper bound of the where clause.
  final IndexKey? upper;

  /// Whether the upper bound should be included in the results. Double values
  /// are never included.
  final bool includeUpper;

  /// The precision to use for floating point values.
  final double epsilon;
}

/// A where clause traversing objects linked to the specified object.
class LinkWhereClause extends WhereClause {
  /// Create a where clause for the specified link.
  const LinkWhereClause({
    required this.linkCollection,
    required this.linkName,
    required this.id,
  }) : super._();

  /// The name of the collection the link originates from.
  final String linkCollection;

  /// The Isar name of the link to be used.
  final String linkName;

  /// The id of the source object.
  final Id id;
}
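
// Usage sketch (illustration only, not part of the library source): where
// clauses are normally produced by generated code, but they can also be
// constructed directly. For example, a clause matching ids 100 through 200
// (both bounds inclusive by default):
//
//   const clause = IdWhereClause.between(lower: 100, upper: 200);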

/// @nodoc
@protected
abstract class FilterOperation {
  const FilterOperation._();
}

/// The type of dynamic filter conditions.
enum FilterConditionType {
  /// Filter checking for equality.
  equalTo,

  /// Filter matching values greater than the bound.
  greaterThan,

  /// Filter matching values smaller than the bound.
  lessThan,

  /// Filter matching values between the bounds.
  between,

  /// Filter matching String values starting with the prefix.
  startsWith,

  /// Filter matching String values ending with the suffix.
  endsWith,

  /// Filter matching String values containing the String.
  contains,

  /// Filter matching String values matching the wildcard.
  matches,

  /// Filter matching values that are `null`.
  isNull,

  /// Filter matching values that are not `null`.
  isNotNull,

  /// Filter matching lists that contain `null`.
  elementIsNull,

  /// Filter matching lists that contain an element that is not `null`.
  elementIsNotNull,

  /// Filter matching the length of a list.
  listLength,
}

/// Create a filter condition dynamically.
class FilterCondition extends FilterOperation {
  /// @nodoc
  @protected
  const FilterCondition({
    required this.type,
    required this.property,
    this.value1,
    this.value2,
    required this.include1,
    required this.include2,
    required this.caseSensitive,
    this.epsilon = Query.epsilon,
  }) : super._();

  /// Filters the results to only include objects where the property equals
  /// [value].
  ///
  /// For lists, at least one of the values in the list has to match.
  const FilterCondition.equalTo({
    required this.property,
    required Object? value,
    this.caseSensitive = true,
    this.epsilon = Query.epsilon,
  })  : type = FilterConditionType.equalTo,
        value1 = value,
        include1 = true,
        value2 = null,
        include2 = false,
        super._();

  /// Filters the results to only include objects where the property is greater
  /// than [value].
  ///
  /// For lists, at least one of the values in the list has to match.
  const FilterCondition.greaterThan({
    required this.property,
    required Object? value,
    bool include = false,
    this.caseSensitive = true,
    this.epsilon = Query.epsilon,
  })  : type = FilterConditionType.greaterThan,
        value1 = value,
        include1 = include,
        value2 = null,
        include2 = false,
        super._();

  /// Filters the results to only include objects where the property is less
  /// than [value].
  ///
  /// For lists, at least one of the values in the list has to match.
  const FilterCondition.lessThan({
    required this.property,
    required Object? value,
    bool include = false,
    this.caseSensitive = true,
    this.epsilon = Query.epsilon,
  })  : type = FilterConditionType.lessThan,
        value1 = value,
        include1 = include,
        value2 = null,
        include2 = false,
        super._();

  /// Filters the results to only include objects where the property is
  /// between [lower] and [upper].
  ///
  /// For lists, at least one of the values in the list has to match.
  const FilterCondition.between({
    required this.property,
    Object? lower,
    bool includeLower = true,
    Object? upper,
    bool includeUpper = true,
    this.caseSensitive = true,
    this.epsilon = Query.epsilon,
  })  : value1 = lower,
        include1 = includeLower,
        value2 = upper,
        include2 = includeUpper,
        type = FilterConditionType.between,
        super._();

  /// Filters the results to only include objects where the property starts
  /// with [value].
  ///
  /// For String lists, at least one of the values in the list has to match.
  const FilterCondition.startsWith({
    required this.property,
    required String value,
    this.caseSensitive = true,
  })  : type = FilterConditionType.startsWith,
        value1 = value,
        include1 = true,
        value2 = null,
        include2 = false,
        epsilon = Query.epsilon,
        super._();

  /// Filters the results to only include objects where the property ends with
  /// [value].
  ///
  /// For String lists, at least one of the values in the list has to match.
  const FilterCondition.endsWith({
    required this.property,
    required String value,
    this.caseSensitive = true,
  })  : type = FilterConditionType.endsWith,
        value1 = value,
        include1 = true,
        value2 = null,
        include2 = false,
        epsilon = Query.epsilon,
        super._();

  /// Filters the results to only include objects where the String property
  /// contains [value].
  ///
  /// For String lists, at least one of the values in the list has to match.
  const FilterCondition.contains({
    required this.property,
    required String value,
    this.caseSensitive = true,
  })  : type = FilterConditionType.contains,
        value1 = value,
        include1 = true,
        value2 = null,
        include2 = false,
        epsilon = Query.epsilon,
        super._();

  /// Filters the results to only include objects where the property matches
  /// the [wildcard].
  ///
  /// For String lists, at least one of the values in the list has to match.
  const FilterCondition.matches({
    required this.property,
    required String wildcard,
    this.caseSensitive = true,
  })  : type = FilterConditionType.matches,
        value1 = wildcard,
        include1 = true,
        value2 = null,
        include2 = false,
        epsilon = Query.epsilon,
        super._();

  /// Filters the results to only include objects where the property is null.
  const FilterCondition.isNull({
    required this.property,
  })  : type = FilterConditionType.isNull,
        value1 = null,
        include1 = false,
        value2 = null,
        include2 = false,
        caseSensitive = false,
        epsilon = Query.epsilon,
        super._();

  /// Filters the results to only include objects where the property is not
  /// null.
  const FilterCondition.isNotNull({
    required this.property,
  })  : type = FilterConditionType.isNotNull,
        value1 = null,
        include1 = false,
        value2 = null,
        include2 = false,
        caseSensitive = false,
        epsilon = Query.epsilon,
        super._();

  /// Filters the results to only include lists that contain `null`.
  const FilterCondition.elementIsNull({
    required this.property,
  })  : type = FilterConditionType.elementIsNull,
        value1 = null,
        include1 = false,
        value2 = null,
        include2 = false,
        caseSensitive = false,
        epsilon = Query.epsilon,
        super._();

  /// Filters the results to only include lists that do not contain `null`.
  const FilterCondition.elementIsNotNull({
    required this.property,
  })  : type = FilterConditionType.elementIsNotNull,
        value1 = null,
        include1 = false,
        value2 = null,
        include2 = false,
        caseSensitive = false,
        epsilon = Query.epsilon,
        super._();

  /// Filters the results to only include objects where the length of
  /// [property] is between [lower] (included) and [upper] (included).
  ///
  /// Only list properties are supported.
  const FilterCondition.listLength({
    required this.property,
    required int lower,
    required int upper,
  })  : type = FilterConditionType.listLength,
        value1 = lower,
        include1 = true,
        value2 = upper,
        include2 = true,
        caseSensitive = false,
        epsilon = Query.epsilon,
        assert(lower >= 0 && upper >= 0, 'List length must be positive.'),
        super._();

  /// Type of the filter condition.
  final FilterConditionType type;

  /// Property used for comparisons.
  final String property;

  /// Value used for comparisons. Lower bound for `ConditionType.between`.
  final Object? value1;

  /// Should `value1` be part of the results.
  final bool include1;

  /// Upper bound for `ConditionType.between`.
  final Object? value2;

  /// Should `value2` be part of the results.
  final bool include2;

  /// Are string operations case sensitive.
  final bool caseSensitive;

  /// The precision to use for floating point values.
  final double epsilon;
}
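
// Usage sketch (illustration only, not part of the library source): generated
// filter methods build these conditions under the hood, but they can also be
// constructed directly. A condition matching objects whose `name` property
// equals 'Alice' (`name` is a hypothetical property):
//
//   const condition = FilterCondition.equalTo(
//     property: 'name',
//     value: 'Alice',
//   );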

/// The type of filter groups.
enum FilterGroupType {
  /// Logical AND.
  and,

  /// Logical OR.
  or,

  /// Logical XOR.
  xor,

  /// Logical NOT.
  not,
}

/// Group one or more filter conditions.
class FilterGroup extends FilterOperation {
  /// @nodoc
  @protected
  FilterGroup({
    required this.type,
    required this.filters,
  }) : super._();

  /// Create a logical AND filter group.
  ///
  /// Matches when all [filters] match.
  const FilterGroup.and(this.filters)
      : type = FilterGroupType.and,
        super._();

  /// Create a logical OR filter group.
  ///
  /// Matches when any of the [filters] matches.
  const FilterGroup.or(this.filters)
      : type = FilterGroupType.or,
        super._();

  /// Create a logical XOR filter group.
  ///
  /// Matches when exactly one of the [filters] matches.
  const FilterGroup.xor(this.filters)
      : type = FilterGroupType.xor,
        super._();

  /// Negate a filter.
  ///
  /// Matches when the [filter] doesn't match.
  FilterGroup.not(FilterOperation filter)
      : filters = [filter],
        type = FilterGroupType.not,
        super._();

  /// Type of this group.
  final FilterGroupType type;

  /// The filter(s) to be grouped.
  final List<FilterOperation> filters;
}

/// Sort order.
enum Sort {
  /// Ascending sort order.
  asc,

  /// Descending sort order.
  desc,
}

/// Property used to sort query results.
class SortProperty {
  /// Create a sort property.
  const SortProperty({required this.property, required this.sort});

  /// Isar name of the property used for sorting.
  final String property;

  /// Sort order.
  final Sort sort;
}

/// Property used to filter duplicate values.
class DistinctProperty {
  /// Create a distinct property.
  const DistinctProperty({required this.property, this.caseSensitive});

  /// Isar name of the property used for distinct checks.
  final String property;

  /// Should Strings be case sensitive?
  final bool? caseSensitive;
}

/// Filter condition based on an embedded object.
class ObjectFilter extends FilterOperation {
  /// Create a filter condition based on an embedded object.
  const ObjectFilter({
    required this.property,
    required this.filter,
  }) : super._();

  /// Property containing the embedded object(s).
  final String property;

  /// Filter condition that should be applied.
  final FilterOperation filter;
}

/// Filter condition based on a link.
class LinkFilter extends FilterOperation {
  /// Create a filter condition based on a link.
  const LinkFilter({
    required this.linkName,
    required FilterOperation this.filter,
  })  : lower = null,
        upper = null,
        super._();

  /// Create a filter condition based on the number of linked objects.
  const LinkFilter.length({
    required this.linkName,
    required int this.lower,
    required int this.upper,
  })  : filter = null,
        assert(lower >= 0 && upper >= 0, 'Link length must be positive.'),
        super._();

  /// Isar name of the link.
  final String linkName;

  /// Filter condition that should be applied.
  final FilterOperation? filter;

  /// The minimum number of linked objects.
  final int? lower;

  /// The maximum number of linked objects.
  final int? upper;
}

lib/src/schema/collection_schema.dart (new file, 152 lines)

part of isar;

/// This schema represents a collection.
class CollectionSchema<OBJ> extends Schema<OBJ> {
  /// @nodoc
  @protected
  const CollectionSchema({
    required super.id,
    required super.name,
    required super.properties,
    required super.estimateSize,
    required super.serialize,
    required super.deserialize,
    required super.deserializeProp,
    required this.idName,
    required this.indexes,
    required this.links,
    required this.embeddedSchemas,
    required this.getId,
    required this.getLinks,
    required this.attach,
    required this.version,
  }) : assert(
          Isar.version == version,
          'Outdated generated code. Please re-run code '
          'generation using the latest generator.',
        );

  /// @nodoc
  @protected
  factory CollectionSchema.fromJson(Map<String, dynamic> json) {
    final collection = Schema<dynamic>.fromJson(json);
    return CollectionSchema(
      id: collection.id,
      name: collection.name,
      properties: collection.properties,
      idName: json['idName'] as String,
      indexes: {
        for (final index in json['indexes'] as List<dynamic>)
          (index as Map<String, dynamic>)['name'] as String:
              IndexSchema.fromJson(index),
      },
      links: {
        for (final link in json['links'] as List<dynamic>)
          (link as Map<String, dynamic>)['name'] as String:
              LinkSchema.fromJson(link),
      },
      embeddedSchemas: {
        for (final schema in json['embeddedSchemas'] as List<dynamic>)
          (schema as Map<String, dynamic>)['name'] as String:
              Schema.fromJson(schema),
      },
      estimateSize: (_, __, ___) => throw UnimplementedError(),
      serialize: (_, __, ___, ____) => throw UnimplementedError(),
      deserialize: (_, __, ___, ____) => throw UnimplementedError(),
      deserializeProp: (_, __, ___, ____) => throw UnimplementedError(),
      getId: (_) => throw UnimplementedError(),
      getLinks: (_) => throw UnimplementedError(),
      attach: (_, __, ___) => throw UnimplementedError(),
      version: Isar.version,
    );
  }

  /// Name of the id property.
  final String idName;

  @override
  bool get embedded => false;

  /// A map of name -> index pairs.
  final Map<String, IndexSchema> indexes;

  /// A map of name -> link pairs.
  final Map<String, LinkSchema> links;

  /// A map of name -> embedded schema pairs.
  final Map<String, Schema<dynamic>> embeddedSchemas;

  /// @nodoc
  final GetId<OBJ> getId;

  /// @nodoc
  final GetLinks<OBJ> getLinks;

  /// @nodoc
  final Attach<OBJ> attach;

  /// @nodoc
  final String version;

  /// @nodoc
  void toCollection(void Function<OBJ>() callback) => callback<OBJ>();

  /// @nodoc
  @pragma('vm:prefer-inline')
  IndexSchema index(String indexName) {
    final index = indexes[indexName];
    if (index != null) {
      return index;
    } else {
      throw IsarError('Unknown index "$indexName"');
    }
  }

  /// @nodoc
  @pragma('vm:prefer-inline')
  LinkSchema link(String linkName) {
    final link = links[linkName];
    if (link != null) {
      return link;
    } else {
      throw IsarError('Unknown link "$linkName"');
    }
  }

  /// @nodoc
  @protected
  @override
  Map<String, dynamic> toJson() {
    final json = {
      ...super.toJson(),
      'idName': idName,
      'indexes': [
        for (final index in indexes.values) index.toJson(),
      ],
      'links': [
        for (final link in links.values) link.toJson(),
      ],
    };

    assert(() {
      json['embeddedSchemas'] = [
        for (final schema in embeddedSchemas.values) schema.toJson(),
      ];
      return true;
    }());

    return json;
  }
}

/// @nodoc
@protected
typedef GetId<T> = Id Function(T object);

/// @nodoc
@protected
typedef GetLinks<T> = List<IsarLinkBase<dynamic>> Function(T object);

/// @nodoc
@protected
typedef Attach<T> = void Function(IsarCollection<T> col, Id id, T object);
104  lib/src/schema/index_schema.dart  Normal file
@@ -0,0 +1,104 @@
part of isar;

/// This schema represents an index.
class IndexSchema {
  /// @nodoc
  @protected
  const IndexSchema({
    required this.id,
    required this.name,
    required this.unique,
    required this.replace,
    required this.properties,
  });

  /// @nodoc
  @protected
  factory IndexSchema.fromJson(Map<String, dynamic> json) {
    return IndexSchema(
      id: -1,
      name: json['name'] as String,
      unique: json['unique'] as bool,
      replace: json['replace'] as bool,
      properties: (json['properties'] as List<dynamic>)
          .map((e) => IndexPropertySchema.fromJson(e as Map<String, dynamic>))
          .toList(),
    );
  }

  /// Internal id of this index.
  final int id;

  /// Name of this index.
  final String name;

  /// Whether duplicates are disallowed in this index.
  final bool unique;

  /// Whether duplicates will be replaced or throw an error.
  final bool replace;

  /// Composite properties.
  final List<IndexPropertySchema> properties;

  /// @nodoc
  @protected
  Map<String, dynamic> toJson() {
    final json = {
      'name': name,
      'unique': unique,
      'replace': replace,
      'properties': [
        for (final property in properties) property.toJson(),
      ],
    };

    return json;
  }
}

/// This schema represents a composite index property.
class IndexPropertySchema {
  /// @nodoc
  @protected
  const IndexPropertySchema({
    required this.name,
    required this.type,
    required this.caseSensitive,
  });

  /// @nodoc
  @protected
  factory IndexPropertySchema.fromJson(Map<String, dynamic> json) {
    return IndexPropertySchema(
      name: json['name'] as String,
      type: IndexType.values.firstWhere((e) => _typeName[e] == json['type']),
      caseSensitive: json['caseSensitive'] as bool,
    );
  }

  /// Isar name of the property.
  final String name;

  /// Type of index.
  final IndexType type;

  /// Whether String properties should be stored with casing.
  final bool caseSensitive;

  /// @nodoc
  @protected
  Map<String, dynamic> toJson() {
    return {
      'name': name,
      'type': _typeName[type],
      'caseSensitive': caseSensitive,
    };
  }

  static const _typeName = {
    IndexType.value: 'Value',
    IndexType.hash: 'Hash',
    IndexType.hashElements: 'HashElements',
  };
}
64  lib/src/schema/link_schema.dart  Normal file
@@ -0,0 +1,64 @@
part of isar;

/// This schema represents a link to the same or another collection.
class LinkSchema {
  /// @nodoc
  @protected
  const LinkSchema({
    required this.id,
    required this.name,
    required this.target,
    required this.single,
    this.linkName,
  });

  /// @nodoc
  @protected
  factory LinkSchema.fromJson(Map<String, dynamic> json) {
    return LinkSchema(
      id: -1,
      name: json['name'] as String,
      target: json['target'] as String,
      single: json['single'] as bool,
      linkName: json['linkName'] as String?,
    );
  }

  /// Internal id of this link.
  final int id;

  /// Name of this link.
  final String name;

  /// Isar name of the target collection.
  final String target;

  /// Whether this link can only hold a single target object.
  final bool single;

  /// If this is a backlink, [linkName] is the name of the source link in the
  /// [target] collection.
  final String? linkName;

  /// Whether this link is a backlink.
  bool get isBacklink => linkName != null;

  /// @nodoc
  @protected
  Map<String, dynamic> toJson() {
    final json = <String, dynamic>{
      'name': name,
      'target': target,
      'single': single,
    };

    assert(() {
      if (linkName != null) {
        json['linkName'] = linkName;
      }
      return true;
    }());

    return json;
  }
}
183  lib/src/schema/property_schema.dart  Normal file
@@ -0,0 +1,183 @@
part of isar;

/// A single property of a collection or embedded object.
class PropertySchema {
  /// @nodoc
  @protected
  const PropertySchema({
    required this.id,
    required this.name,
    required this.type,
    this.enumMap,
    this.target,
  });

  /// @nodoc
  @protected
  factory PropertySchema.fromJson(Map<String, dynamic> json) {
    return PropertySchema(
      id: -1,
      name: json['name'] as String,
      type: IsarType.values.firstWhere((e) => e.schemaName == json['type']),
      enumMap: json['enumMap'] as Map<String, dynamic>?,
      target: json['target'] as String?,
    );
  }

  /// Internal id of this property.
  final int id;

  /// Name of the property
  final String name;

  /// Isar type of the property
  final IsarType type;

  /// Maps enum names to database values
  final Map<String, dynamic>? enumMap;

  /// For embedded objects: Name of the target schema
  final String? target;

  /// @nodoc
  @protected
  Map<String, dynamic> toJson() {
    final json = <String, dynamic>{
      'name': name,
      'type': type.schemaName,
      if (target != null) 'target': target,
    };

    assert(() {
      if (enumMap != null) {
        json['enumMap'] = enumMap;
      }
      return true;
    }());

    return json;
  }
}

/// Supported Isar types
enum IsarType {
  /// Boolean
  bool('Bool'),

  /// 8-bit unsigned integer
  byte('Byte'),

  /// 32-bit signed integer
  int('Int'),

  /// 32-bit float
  float('Float'),

  /// 64-bit signed integer
  long('Long'),

  /// 64-bit float
  double('Double'),

  /// DateTime
  dateTime('DateTime'),

  /// String
  string('String'),

  /// Embedded object
  object('Object'),

  /// Boolean list
  boolList('BoolList'),

  /// 8-bit unsigned integer list
  byteList('ByteList'),

  /// 32-bit signed integer list
  intList('IntList'),

  /// 32-bit float list
  floatList('FloatList'),

  /// 64-bit signed integer list
  longList('LongList'),

  /// 64-bit float list
  doubleList('DoubleList'),

  /// DateTime list
  dateTimeList('DateTimeList'),

  /// String list
  stringList('StringList'),

  /// Embedded object list
  objectList('ObjectList');

  /// @nodoc
  const IsarType(this.schemaName);

  /// @nodoc
  final String schemaName;
}

/// @nodoc
extension IsarTypeX on IsarType {
  /// Whether this type represents a list
  bool get isList => index >= IsarType.boolList.index;

  /// @nodoc
  IsarType get scalarType {
    switch (this) {
      case IsarType.boolList:
        return IsarType.bool;
      case IsarType.byteList:
        return IsarType.byte;
      case IsarType.intList:
        return IsarType.int;
      case IsarType.floatList:
        return IsarType.float;
      case IsarType.longList:
        return IsarType.long;
      case IsarType.doubleList:
        return IsarType.double;
      case IsarType.dateTimeList:
        return IsarType.dateTime;
      case IsarType.stringList:
        return IsarType.string;
      case IsarType.objectList:
        return IsarType.object;
      // ignore: no_default_cases
      default:
        return this;
    }
  }

  /// @nodoc
  IsarType get listType {
    switch (this) {
      case IsarType.bool:
        return IsarType.boolList;
      case IsarType.byte:
        return IsarType.byteList;
      case IsarType.int:
        return IsarType.intList;
      case IsarType.float:
        return IsarType.floatList;
      case IsarType.long:
        return IsarType.longList;
      case IsarType.double:
        return IsarType.doubleList;
      case IsarType.dateTime:
        return IsarType.dateTimeList;
      case IsarType.string:
        return IsarType.stringList;
      case IsarType.object:
        return IsarType.objectList;
      // ignore: no_default_cases
      default:
        return this;
    }
  }
}
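The `IsarTypeX` extension above works because the enum declares every list variant contiguously after the scalar variants, so `isList` reduces to a plain index comparison and `scalarType`/`listType` act as inverse mappings. A minimal sketch of that invariant (hypothetical usage, not part of this commit):

```dart
void main() {
  // listType and scalarType are inverses for each scalar/list pair.
  assert(IsarType.int.listType == IsarType.intList);
  assert(IsarType.intList.scalarType == IsarType.int);
  // isList is an index comparison: list variants follow all scalars.
  assert(!IsarType.string.isList);
  assert(IsarType.stringList.isList);
}
```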
126  lib/src/schema/schema.dart  Normal file
@@ -0,0 +1,126 @@
part of isar;

/// This schema represents either a collection or an embedded object.
class Schema<OBJ> {
  /// @nodoc
  @protected
  const Schema({
    required this.id,
    required this.name,
    required this.properties,
    required this.estimateSize,
    required this.serialize,
    required this.deserialize,
    required this.deserializeProp,
  });

  /// @nodoc
  @protected
  factory Schema.fromJson(Map<String, dynamic> json) {
    return Schema(
      id: -1,
      name: json['name'] as String,
      properties: {
        for (final property in json['properties'] as List<dynamic>)
          (property as Map<String, dynamic>)['name'] as String:
              PropertySchema.fromJson(property),
      },
      estimateSize: (_, __, ___) => throw UnimplementedError(),
      serialize: (_, __, ___, ____) => throw UnimplementedError(),
      deserialize: (_, __, ___, ____) => throw UnimplementedError(),
      deserializeProp: (_, __, ___, ____) => throw UnimplementedError(),
    );
  }

  /// Internal id of this collection or embedded object.
  final int id;

  /// Name of the collection or embedded object
  final String name;

  /// Whether this is an embedded object
  bool get embedded => true;

  /// A map of name -> property pairs
  final Map<String, PropertySchema> properties;

  /// @nodoc
  @protected
  final EstimateSize<OBJ> estimateSize;

  /// @nodoc
  @protected
  final Serialize<OBJ> serialize;

  /// @nodoc
  @protected
  final Deserialize<OBJ> deserialize;

  /// @nodoc
  @protected
  final DeserializeProp deserializeProp;

  /// Returns a property by its name or throws an error.
  @pragma('vm:prefer-inline')
  PropertySchema property(String propertyName) {
    final property = properties[propertyName];
    if (property != null) {
      return property;
    } else {
      throw IsarError('Unknown property "$propertyName"');
    }
  }

  /// @nodoc
  @protected
  Map<String, dynamic> toJson() {
    final json = {
      'name': name,
      'embedded': embedded,
      'properties': [
        for (final property in properties.values) property.toJson(),
      ],
    };

    return json;
  }

  /// @nodoc
  @protected
  Type get type => OBJ;
}

/// @nodoc
@protected
typedef EstimateSize<T> = int Function(
  T object,
  List<int> offsets,
  Map<Type, List<int>> allOffsets,
);

/// @nodoc
@protected
typedef Serialize<T> = void Function(
  T object,
  IsarWriter writer,
  List<int> offsets,
  Map<Type, List<int>> allOffsets,
);

/// @nodoc
@protected
typedef Deserialize<T> = T Function(
  Id id,
  IsarReader reader,
  List<int> offsets,
  Map<Type, List<int>> allOffsets,
);

/// @nodoc
@protected
typedef DeserializeProp = dynamic Function(
  IsarReader reader,
  int propertyId,
  int offset,
  Map<Type, List<int>> allOffsets,
);
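Note that `Schema.fromJson` restores only the structural half of a schema: every serialization callback is replaced by a function that throws `UnimplementedError`, so a JSON-decoded schema can be inspected (e.g. for migration checks) but never used to read or write objects. A hedged sketch of that restriction (hypothetical usage, not part of this commit):

```dart
final schema = Schema<dynamic>.fromJson({
  'name': 'User',
  'properties': [
    {'name': 'age', 'type': 'Long'},
  ],
});
// Structural inspection works:
assert(schema.property('age').type == IsarType.long);
// But schema.serialize / schema.deserialize would throw UnimplementedError.
```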
188  lib/src/web/bindings.dart  Normal file
@@ -0,0 +1,188 @@
// ignore_for_file: public_member_api_docs

import 'dart:indexed_db';
import 'dart:js';

import 'package:isar/isar.dart';
import 'package:js/js.dart';
import 'package:js/js_util.dart';

@JS('JSON.stringify')
external String stringify(dynamic value);

@JS('indexedDB.cmp')
external int idbCmp(dynamic value1, dynamic value2);

@JS('Object.keys')
external List<String> objectKeys(dynamic obj);

Map<String, dynamic> jsMapToDart(Object obj) {
  final keys = objectKeys(obj);
  final map = <String, dynamic>{};
  for (final key in keys) {
    map[key] = getProperty<dynamic>(obj, key);
  }
  return map;
}

@JS('Promise')
class Promise {}

extension PromiseX on Promise {
  Future<T> wait<T>() => promiseToFuture(this);
}

@JS('openIsar')
external Promise openIsarJs(
  String name,
  List<dynamic> schemas,
  bool relaxedDurability,
);

@JS('IsarTxn')
class IsarTxnJs {
  external Promise commit();

  external void abort();

  external bool get write;
}

@JS('IsarInstance')
class IsarInstanceJs {
  external IsarTxnJs beginTxn(bool write);

  external IsarCollectionJs getCollection(String name);

  external Promise close(bool deleteFromDisk);
}

typedef ChangeCallbackJs = void Function();

typedef ObjectChangeCallbackJs = void Function(Object? object);

typedef QueryChangeCallbackJs = void Function(List<dynamic> results);

typedef StopWatchingJs = JsFunction;

@JS('IsarCollection')
class IsarCollectionJs {
  external IsarLinkJs getLink(String name);

  external Promise getAll(IsarTxnJs txn, List<Id> ids);

  external Promise getAllByIndex(
    IsarTxnJs txn,
    String indexName,
    List<List<dynamic>> values,
  );

  external Promise putAll(IsarTxnJs txn, List<dynamic> objects);

  external Promise deleteAll(IsarTxnJs txn, List<Id> ids);

  external Promise deleteAllByIndex(
    IsarTxnJs txn,
    String indexName,
    List<dynamic> keys,
  );

  external Promise clear(IsarTxnJs txn);

  external StopWatchingJs watchLazy(ChangeCallbackJs callback);

  external StopWatchingJs watchObject(Id id, ObjectChangeCallbackJs callback);

  external StopWatchingJs watchQuery(
    QueryJs query,
    QueryChangeCallbackJs callback,
  );

  external StopWatchingJs watchQueryLazy(
    QueryJs query,
    ChangeCallbackJs callback,
  );
}

@JS('IsarLink')
class IsarLinkJs {
  external Promise update(
    IsarTxnJs txn,
    bool backlink,
    Id id,
    List<Id> addedTargets,
    List<Id> deletedTargets,
  );

  external Promise clear(IsarTxnJs txn, Id id, bool backlink);
}

@JS('IdWhereClause')
@anonymous
class IdWhereClauseJs {
  external KeyRange? range;
}

@JS('IndexWhereClause')
@anonymous
class IndexWhereClauseJs {
  external String indexName;
  external KeyRange? range;
}

@JS('LinkWhereClause')
@anonymous
class LinkWhereClauseJs {
  external String linkCollection;
  external String linkName;
  external bool backlink;
  external Id id;
}

@JS('Function')
class FilterJs {
  external FilterJs(String id, String obj, String method);
}

@JS('Function')
class SortCmpJs {
  external SortCmpJs(String a, String b, String method);
}

@JS('Function')
class DistinctValueJs {
  external DistinctValueJs(String obj, String method);
}

@JS('IsarQuery')
class QueryJs {
  external QueryJs(
    IsarCollectionJs collection,
    List<dynamic> whereClauses,
    bool whereDistinct,
    bool whereAscending,
    FilterJs? filter,
    SortCmpJs? sortCmp,
    DistinctValueJs? distinctValue,
    int? offset,
    int? limit,
  );

  external Promise findFirst(IsarTxnJs txn);

  external Promise findAll(IsarTxnJs txn);

  external Promise deleteFirst(IsarTxnJs txn);

  external Promise deleteAll(IsarTxnJs txn);

  external Promise min(IsarTxnJs txn, String propertyName);

  external Promise max(IsarTxnJs txn, String propertyName);

  external Promise sum(IsarTxnJs txn, String propertyName);

  external Promise average(IsarTxnJs txn, String propertyName);

  external Promise count(IsarTxnJs txn);
}
266  lib/src/web/isar_collection_impl.dart  Normal file
@@ -0,0 +1,266 @@
// ignore_for_file: public_member_api_docs, invalid_use_of_protected_member

import 'dart:async';
import 'dart:convert';
import 'dart:js';
import 'dart:js_util';
import 'dart:typed_data';

import 'package:isar/isar.dart';
import 'package:isar/src/web/bindings.dart';
import 'package:isar/src/web/isar_impl.dart';
import 'package:isar/src/web/isar_reader_impl.dart';
import 'package:isar/src/web/isar_web.dart';
import 'package:isar/src/web/isar_writer_impl.dart';
import 'package:isar/src/web/query_build.dart';
import 'package:meta/dart2js.dart';

class IsarCollectionImpl<OBJ> extends IsarCollection<OBJ> {
  IsarCollectionImpl({
    required this.isar,
    required this.native,
    required this.schema,
  });

  @override
  final IsarImpl isar;
  final IsarCollectionJs native;

  @override
  final CollectionSchema<OBJ> schema;

  @override
  String get name => schema.name;

  late final _offsets = isar.offsets[OBJ]!;

  @tryInline
  OBJ deserializeObject(Object object) {
    final id = getProperty<int>(object, idName);
    final reader = IsarReaderImpl(object);
    return schema.deserialize(id, reader, _offsets, isar.offsets);
  }

  @tryInline
  List<OBJ?> deserializeObjects(dynamic objects) {
    final list = objects as List;
    final results = <OBJ?>[];
    for (final object in list) {
      results.add(object is Object ? deserializeObject(object) : null);
    }
    return results;
  }

  @override
  Future<List<OBJ?>> getAll(List<Id> ids) {
    return isar.getTxn(false, (IsarTxnJs txn) async {
      final objects = await native.getAll(txn, ids).wait<List<Object?>>();
      return deserializeObjects(objects);
    });
  }

  @override
  Future<List<OBJ?>> getAllByIndex(String indexName, List<IndexKey> keys) {
    return isar.getTxn(false, (IsarTxnJs txn) async {
      final objects = await native
          .getAllByIndex(txn, indexName, keys)
          .wait<List<Object?>>();
      return deserializeObjects(objects);
    });
  }

  @override
  List<OBJ?> getAllSync(List<Id> ids) => unsupportedOnWeb();

  @override
  List<OBJ?> getAllByIndexSync(String indexName, List<IndexKey> keys) =>
      unsupportedOnWeb();

  @override
  Future<List<Id>> putAll(List<OBJ> objects) {
    return putAllByIndex(null, objects);
  }

  @override
  List<int> putAllSync(List<OBJ> objects, {bool saveLinks = true}) =>
      unsupportedOnWeb();

  @override
  Future<List<Id>> putAllByIndex(String? indexName, List<OBJ> objects) {
    return isar.getTxn(true, (IsarTxnJs txn) async {
      final serialized = <Object>[];
      for (final object in objects) {
        final jsObj = newObject<Object>();
        final writer = IsarWriterImpl(jsObj);
        schema.serialize(object, writer, _offsets, isar.offsets);
        setProperty(jsObj, idName, schema.getId(object));
        serialized.add(jsObj);
      }
      final ids = await native.putAll(txn, serialized).wait<List<dynamic>>();
      for (var i = 0; i < objects.length; i++) {
        final object = objects[i];
        final id = ids[i] as Id;
        schema.attach(this, id, object);
      }

      return ids.cast<Id>().toList();
    });
  }

  @override
  List<Id> putAllByIndexSync(
    String indexName,
    List<OBJ> objects, {
    bool saveLinks = true,
  }) =>
      unsupportedOnWeb();

  @override
  Future<int> deleteAll(List<Id> ids) async {
    await isar.getTxn(true, (IsarTxnJs txn) {
      return native.deleteAll(txn, ids).wait<void>();
    });
    return ids.length;
  }

  @override
  Future<int> deleteAllByIndex(String indexName, List<IndexKey> keys) {
    return isar.getTxn(true, (IsarTxnJs txn) {
      return native.deleteAllByIndex(txn, indexName, keys).wait();
    });
  }

  @override
  int deleteAllSync(List<Id> ids) => unsupportedOnWeb();

  @override
  int deleteAllByIndexSync(String indexName, List<IndexKey> keys) =>
      unsupportedOnWeb();

  @override
  Future<void> clear() {
    return isar.getTxn(true, (IsarTxnJs txn) {
      return native.clear(txn).wait();
    });
  }

  @override
  void clearSync() => unsupportedOnWeb();

  @override
  Future<void> importJson(List<Map<String, dynamic>> json) {
    return isar.getTxn(true, (IsarTxnJs txn) async {
      await native.putAll(txn, json.map(jsify).toList()).wait<dynamic>();
    });
  }

  @override
  Future<void> importJsonRaw(Uint8List jsonBytes) {
    final json = jsonDecode(const Utf8Decoder().convert(jsonBytes)) as List;
    return importJson(json.cast());
  }

  @override
  void importJsonSync(List<Map<String, dynamic>> json) => unsupportedOnWeb();

  @override
  void importJsonRawSync(Uint8List jsonBytes) => unsupportedOnWeb();

  @override
  Future<int> count() => where().count();

  @override
  int countSync() => unsupportedOnWeb();

  @override
  Future<int> getSize({
    bool includeIndexes = false,
    bool includeLinks = false,
  }) =>
      unsupportedOnWeb();

  @override
  int getSizeSync({
    bool includeIndexes = false,
    bool includeLinks = false,
  }) =>
      unsupportedOnWeb();

  @override
  Stream<void> watchLazy({bool fireImmediately = false}) {
    JsFunction? stop;
    final controller = StreamController<void>(
      onCancel: () {
        stop?.apply([]);
      },
    );

    final void Function() callback = allowInterop(() => controller.add(null));
    stop = native.watchLazy(callback);

    return controller.stream;
  }

  @override
  Stream<OBJ?> watchObject(
    Id id, {
    bool fireImmediately = false,
    bool deserialize = true,
  }) {
    JsFunction? stop;
    final controller = StreamController<OBJ?>(
      onCancel: () {
        stop?.apply([]);
      },
    );

    final Null Function(Object? obj) callback = allowInterop((Object? obj) {
      final object = deserialize && obj != null ? deserializeObject(obj) : null;
      controller.add(object);
    });
    stop = native.watchObject(id, callback);

    return controller.stream;
  }

  @override
  Stream<void> watchObjectLazy(Id id, {bool fireImmediately = false}) =>
      watchObject(id, deserialize: false);

  @override
  Query<T> buildQuery<T>({
    List<WhereClause> whereClauses = const [],
    bool whereDistinct = false,
    Sort whereSort = Sort.asc,
    FilterOperation? filter,
    List<SortProperty> sortBy = const [],
    List<DistinctProperty> distinctBy = const [],
    int? offset,
    int? limit,
    String? property,
  }) {
    return buildWebQuery(
      this,
      whereClauses,
      whereDistinct,
      whereSort,
      filter,
      sortBy,
      distinctBy,
      offset,
      limit,
      property,
    );
  }

  @override
  Future<void> verify(List<OBJ> objects) => unsupportedOnWeb();

  @override
  Future<void> verifyLink(
    String linkName,
    List<int> sourceIds,
    List<int> targetIds,
  ) =>
      unsupportedOnWeb();
}
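The `watchLazy`/`watchObject` implementations above follow a common JS-interop pattern: create a `StreamController` whose `onCancel` invokes the native stop-function, wrap the Dart callback with `allowInterop`, and register it with the native watcher. A schematic outline of just that pattern (hypothetical helper, not part of this commit):

```dart
Stream<void> watchPattern(
  JsFunction Function(void Function()) register,
) {
  JsFunction? stop;
  final controller = StreamController<void>(
    // Cancelling the subscription tears down the native watcher.
    onCancel: () => stop?.apply([]),
  );
  // allowInterop is required before passing a Dart closure to JS.
  stop = register(allowInterop(() => controller.add(null)));
  return controller.stream;
}
```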
135
lib/src/web/isar_impl.dart
Normal file
135
lib/src/web/isar_impl.dart
Normal file
@ -0,0 +1,135 @@
// ignore_for_file: public_member_api_docs

import 'dart:async';
import 'dart:html';

import 'package:isar/isar.dart';

import 'package:isar/src/web/bindings.dart';
import 'package:isar/src/web/isar_web.dart';

const Symbol _zoneTxn = #zoneTxn;

class IsarImpl extends Isar {
  IsarImpl(super.name, this.instance);

  final IsarInstanceJs instance;
  final offsets = <Type, List<int>>{};
  final List<Future<void>> _activeAsyncTxns = [];

  @override
  final String? directory = null;

  void requireNotInTxn() {
    if (Zone.current[_zoneTxn] != null) {
      throw IsarError(
        'Cannot perform this operation from within an active transaction.',
      );
    }
  }

  Future<T> _txn<T>(
    bool write,
    bool silent,
    Future<T> Function() callback,
  ) async {
    requireOpen();
    requireNotInTxn();

    final completer = Completer<void>();
    _activeAsyncTxns.add(completer.future);

    final txn = instance.beginTxn(write);

    final zone = Zone.current.fork(
      zoneValues: {_zoneTxn: txn},
    );

    T result;
    try {
      result = await zone.run(callback);
      await txn.commit().wait<dynamic>();
    } catch (e) {
      txn.abort();
      if (e is DomException) {
        if (e.name == DomException.CONSTRAINT) {
          throw IsarUniqueViolationError();
        } else {
          throw IsarError('${e.name}: ${e.message}');
        }
      } else {
        rethrow;
      }
    } finally {
      completer.complete();
      _activeAsyncTxns.remove(completer.future);
    }

    return result;
  }

  @override
  Future<T> txn<T>(Future<T> Function() callback) {
    return _txn(false, false, callback);
  }

  @override
  Future<T> writeTxn<T>(Future<T> Function() callback, {bool silent = false}) {
    return _txn(true, silent, callback);
  }

  @override
  T txnSync<T>(T Function() callback) => unsupportedOnWeb();

  @override
  T writeTxnSync<T>(T Function() callback, {bool silent = false}) =>
      unsupportedOnWeb();

  Future<T> getTxn<T>(bool write, Future<T> Function(IsarTxnJs txn) callback) {
    final currentTxn = Zone.current[_zoneTxn] as IsarTxnJs?;
    if (currentTxn != null) {
      if (write && !currentTxn.write) {
        throw IsarError(
          'Operation cannot be performed within a read transaction.',
        );
      }
      return callback(currentTxn);
    } else if (!write) {
      return _txn(false, false, () {
        return callback(Zone.current[_zoneTxn] as IsarTxnJs);
      });
    } else {
      throw IsarError('Write operations require an explicit transaction.');
    }
  }

  @override
  Future<int> getSize({
    bool includeIndexes = false,
    bool includeLinks = false,
  }) =>
      unsupportedOnWeb();

  @override
  int getSizeSync({
    bool includeIndexes = false,
    bool includeLinks = false,
  }) =>
      unsupportedOnWeb();

  @override
  Future<void> copyToFile(String targetPath) => unsupportedOnWeb();

  @override
  Future<bool> close({bool deleteFromDisk = false}) async {
    requireOpen();
    requireNotInTxn();
    await Future.wait(_activeAsyncTxns);
    await super.close();
    await instance.close(deleteFromDisk).wait<dynamic>();
    return true;
  }

  @override
  Future<void> verify() => unsupportedOnWeb();
}
75 lib/src/web/isar_link_impl.dart Normal file
@@ -0,0 +1,75 @@
// ignore_for_file: public_member_api_docs

import 'package:isar/isar.dart';
import 'package:isar/src/common/isar_link_base_impl.dart';
import 'package:isar/src/common/isar_link_common.dart';
import 'package:isar/src/common/isar_links_common.dart';
import 'package:isar/src/web/bindings.dart';
import 'package:isar/src/web/isar_collection_impl.dart';
import 'package:isar/src/web/isar_web.dart';

mixin IsarLinkBaseMixin<OBJ> on IsarLinkBaseImpl<OBJ> {
  @override
  IsarCollectionImpl<dynamic> get sourceCollection =>
      super.sourceCollection as IsarCollectionImpl;

  @override
  IsarCollectionImpl<OBJ> get targetCollection =>
      super.targetCollection as IsarCollectionImpl<OBJ>;

  @override
  late final Id Function(OBJ) getId = targetCollection.schema.getId;

  late final String? backlinkLinkName =
      sourceCollection.schema.link(linkName).linkName;

  late final IsarLinkJs jsLink = backlinkLinkName != null
      ? targetCollection.native.getLink(backlinkLinkName!)
      : sourceCollection.native.getLink(linkName);

  @override
  Future<void> update({
    Iterable<OBJ> link = const [],
    Iterable<OBJ> unlink = const [],
    bool reset = false,
  }) {
    final linkList = link.toList();
    final unlinkList = unlink.toList();

    final containingId = requireAttached();
    final backlink = backlinkLinkName != null;

    final linkIds = List<Id>.filled(linkList.length, 0);
    for (var i = 0; i < linkList.length; i++) {
      linkIds[i] = requireGetId(linkList[i]);
    }

    final unlinkIds = List<Id>.filled(unlinkList.length, 0);
    for (var i = 0; i < unlinkList.length; i++) {
      unlinkIds[i] = requireGetId(unlinkList[i]);
    }

    return targetCollection.isar.getTxn(true, (IsarTxnJs txn) async {
      if (reset) {
        await jsLink.clear(txn, containingId, backlink).wait<dynamic>();
      }
      return jsLink
          .update(txn, backlink, containingId, linkIds, unlinkIds)
          .wait();
    });
  }

  @override
  void updateSync({
    Iterable<OBJ> link = const [],
    Iterable<OBJ> unlink = const [],
    bool reset = false,
  }) =>
      unsupportedOnWeb();
}

class IsarLinkImpl<OBJ> extends IsarLinkCommon<OBJ>
    with IsarLinkBaseMixin<OBJ> {}

class IsarLinksImpl<OBJ> extends IsarLinksCommon<OBJ>
    with IsarLinkBaseMixin<OBJ> {}
347 lib/src/web/isar_reader_impl.dart Normal file
@@ -0,0 +1,347 @@
// ignore_for_file: public_member_api_docs

import 'package:isar/isar.dart';
import 'package:js/js_util.dart';
import 'package:meta/dart2js.dart';

const nullNumber = double.negativeInfinity;
const idName = '_id';
final nullDate = DateTime.fromMillisecondsSinceEpoch(0);

class IsarReaderImpl implements IsarReader {
  IsarReaderImpl(this.object);

  final Object object;

  @tryInline
  @override
  bool readBool(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value == 1;
  }

  @tryInline
  @override
  bool? readBoolOrNull(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value == 0
        ? false
        : value == 1
            ? true
            : null;
  }

  @tryInline
  @override
  int readByte(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is int ? value : nullNumber as int;
  }

  @tryInline
  @override
  int? readByteOrNull(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is int && value != nullNumber ? value : null;
  }

  @tryInline
  @override
  int readInt(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is int ? value : nullNumber as int;
  }

  @tryInline
  @override
  int? readIntOrNull(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is int && value != nullNumber ? value : null;
  }

  @tryInline
  @override
  double readFloat(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is double ? value : nullNumber;
  }

  @tryInline
  @override
  double? readFloatOrNull(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is double && value != nullNumber ? value : null;
  }

  @tryInline
  @override
  int readLong(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is int ? value : nullNumber as int;
  }

  @tryInline
  @override
  int? readLongOrNull(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is int && value != nullNumber ? value : null;
  }

  @tryInline
  @override
  double readDouble(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is double && value != nullNumber ? value : nullNumber;
  }

  @tryInline
  @override
  double? readDoubleOrNull(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is double && value != nullNumber ? value : null;
  }

  @tryInline
  @override
  DateTime readDateTime(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is int && value != nullNumber
        ? DateTime.fromMillisecondsSinceEpoch(value, isUtc: true).toLocal()
        : nullDate;
  }

  @tryInline
  @override
  DateTime? readDateTimeOrNull(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is int && value != nullNumber
        ? DateTime.fromMillisecondsSinceEpoch(value, isUtc: true).toLocal()
        : null;
  }

  @tryInline
  @override
  String readString(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is String ? value : '';
  }

  @tryInline
  @override
  String? readStringOrNull(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is String ? value : null;
  }

  @tryInline
  @override
  T? readObjectOrNull<T>(
    int offset,
    Deserialize<T> deserialize,
    Map<Type, List<int>> allOffsets,
  ) {
    final value = getProperty<dynamic>(object, offset);
    if (value is Object) {
      final reader = IsarReaderImpl(value);
      return deserialize(0, reader, allOffsets[T]!, allOffsets);
    } else {
      return null;
    }
  }

  @tryInline
  @override
  List<bool>? readBoolList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List ? value.map((e) => e == 1).toList() : null;
  }

  @tryInline
  @override
  List<bool?>? readBoolOrNullList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value
            .map(
              (e) => e == 0
                  ? false
                  : e == 1
                      ? true
                      : null,
            )
            .toList()
        : null;
  }

  @tryInline
  @override
  List<int>? readByteList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value.map((e) => e is int ? e : nullNumber as int).toList()
        : null;
  }

  @tryInline
  @override
  List<int>? readIntList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value.map((e) => e is int ? e : nullNumber as int).toList()
        : null;
  }

  @tryInline
  @override
  List<int?>? readIntOrNullList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value.map((e) => e is int && e != nullNumber ? e : null).toList()
        : null;
  }

  @tryInline
  @override
  List<double>? readFloatList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value.map((e) => e is double ? e : nullNumber).toList()
        : null;
  }

  @tryInline
  @override
  List<double?>? readFloatOrNullList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value.map((e) => e is double && e != nullNumber ? e : null).toList()
        : null;
  }

  @tryInline
  @override
  List<int>? readLongList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value.map((e) => e is int ? e : nullNumber as int).toList()
        : null;
  }

  @tryInline
  @override
  List<int?>? readLongOrNullList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value.map((e) => e is int && e != nullNumber ? e : null).toList()
        : null;
  }

  @tryInline
  @override
  List<double>? readDoubleList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value.map((e) => e is double ? e : nullNumber).toList()
        : null;
  }

  @tryInline
  @override
  List<double?>? readDoubleOrNullList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value.map((e) => e is double && e != nullNumber ? e : null).toList()
        : null;
  }

  @tryInline
  @override
  List<DateTime>? readDateTimeList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value
            .map(
              (e) => e is int && e != nullNumber
                  ? DateTime.fromMillisecondsSinceEpoch(e, isUtc: true)
                      .toLocal()
                  : nullDate,
            )
            .toList()
        : null;
  }

  @tryInline
  @override
  List<DateTime?>? readDateTimeOrNullList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value
            .map(
              (e) => e is int && e != nullNumber
                  ? DateTime.fromMillisecondsSinceEpoch(e, isUtc: true)
                      .toLocal()
                  : null,
            )
            .toList()
        : null;
  }

  @tryInline
  @override
  List<String>? readStringList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value.map((e) => e is String ? e : '').toList()
        : null;
  }

  @tryInline
  @override
  List<String?>? readStringOrNullList(int offset) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value.map((e) => e is String ? e : null).toList()
        : null;
  }

  @tryInline
  @override
  List<T>? readObjectList<T>(
    int offset,
    Deserialize<T> deserialize,
    Map<Type, List<int>> allOffsets,
    T defaultValue,
  ) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value.map((e) {
            if (e is Object) {
              final reader = IsarReaderImpl(e);
              return deserialize(0, reader, allOffsets[T]!, allOffsets);
            } else {
              return defaultValue;
            }
          }).toList()
        : null;
  }

  @tryInline
  @override
  List<T?>? readObjectOrNullList<T>(
    int offset,
    Deserialize<T> deserialize,
    Map<Type, List<int>> allOffsets,
  ) {
    final value = getProperty<dynamic>(object, offset);
    return value is List
        ? value.map((e) {
            if (e is Object) {
              final reader = IsarReaderImpl(e);
              return deserialize(0, reader, allOffsets[T]!, allOffsets);
            } else {
              return null;
            }
          }).toList()
        : null;
  }
}
48 lib/src/web/isar_web.dart Normal file
@@ -0,0 +1,48 @@
// ignore_for_file: unused_field, public_member_api_docs

import 'dart:async';

import 'package:isar/isar.dart';
import 'package:meta/meta.dart';

/// @nodoc
@protected
const Id isarMinId = -9007199254740990;

/// @nodoc
@protected
const Id isarMaxId = 9007199254740991;

/// @nodoc
@protected
const Id isarAutoIncrementId = -9007199254740991;

/// @nodoc
Never unsupportedOnWeb() {
  throw UnsupportedError('This operation is not supported for Isar web');
}

class _WebAbi {
  static const androidArm = null as dynamic;
  static const androidArm64 = null as dynamic;
  static const androidIA32 = null as dynamic;
  static const androidX64 = null as dynamic;
  static const iosArm64 = null as dynamic;
  static const iosX64 = null as dynamic;
  static const linuxArm64 = null as dynamic;
  static const linuxX64 = null as dynamic;
  static const macosArm64 = null as dynamic;
  static const macosX64 = null as dynamic;
  static const windowsArm64 = null as dynamic;
  static const windowsX64 = null as dynamic;
}

/// @nodoc
@protected
typedef IsarAbi = _WebAbi;

FutureOr<void> initializeCoreBinary({
  Map<IsarAbi, String> libraries = const {},
  bool download = false,
}) =>
    unsupportedOnWeb();
171 lib/src/web/isar_writer_impl.dart Normal file
@@ -0,0 +1,171 @@
// ignore_for_file: public_member_api_docs

import 'package:isar/isar.dart';
import 'package:isar/src/web/isar_reader_impl.dart';
import 'package:js/js_util.dart';
import 'package:meta/dart2js.dart';

class IsarWriterImpl implements IsarWriter {
  IsarWriterImpl(this.object);

  final Object object;

  @tryInline
  @override
  void writeBool(int offset, bool? value) {
    final number = value == true
        ? 1
        : value == false
            ? 0
            : nullNumber;
    setProperty(object, offset, number);
  }

  @tryInline
  @override
  void writeByte(int offset, int value) {
    setProperty(object, offset, value);
  }

  @tryInline
  @override
  void writeInt(int offset, int? value) {
    setProperty(object, offset, value ?? nullNumber);
  }

  @tryInline
  @override
  void writeFloat(int offset, double? value) {
    setProperty(object, offset, value ?? nullNumber);
  }

  @tryInline
  @override
  void writeLong(int offset, int? value) {
    setProperty(object, offset, value ?? nullNumber);
  }

  @tryInline
  @override
  void writeDouble(int offset, double? value) {
    setProperty(object, offset, value ?? nullNumber);
  }

  @tryInline
  @override
  void writeDateTime(int offset, DateTime? value) {
    setProperty(
      object,
      offset,
      value?.toUtc().millisecondsSinceEpoch ?? nullNumber,
    );
  }

  @tryInline
  @override
  void writeString(int offset, String? value) {
    setProperty(object, offset, value ?? nullNumber);
  }

  @tryInline
  @override
  void writeObject<T>(
    int offset,
    Map<Type, List<int>> allOffsets,
    Serialize<T> serialize,
    T? value,
  ) {
    if (value != null) {
      final object = newObject<Object>();
      final writer = IsarWriterImpl(object);
      serialize(value, writer, allOffsets[T]!, allOffsets);
      setProperty(this.object, offset, object);
    }
  }

  @tryInline
  @override
  void writeByteList(int offset, List<int>? values) {
    setProperty(object, offset, values ?? nullNumber);
  }

  @tryInline
  @override
  void writeBoolList(int offset, List<bool?>? values) {
    final list = values
        ?.map(
          (e) => e == false
              ? 0
              : e == true
                  ? 1
                  : nullNumber,
        )
        .toList();
    setProperty(object, offset, list ?? nullNumber);
  }

  @tryInline
  @override
  void writeIntList(int offset, List<int?>? values) {
    final list = values?.map((e) => e ?? nullNumber).toList();
    setProperty(object, offset, list ?? nullNumber);
  }

  @tryInline
  @override
  void writeFloatList(int offset, List<double?>? values) {
    final list = values?.map((e) => e ?? nullNumber).toList();
    setProperty(object, offset, list ?? nullNumber);
  }

  @tryInline
  @override
  void writeLongList(int offset, List<int?>? values) {
    final list = values?.map((e) => e ?? nullNumber).toList();
    setProperty(object, offset, list ?? nullNumber);
  }

  @tryInline
  @override
  void writeDoubleList(int offset, List<double?>? values) {
    final list = values?.map((e) => e ?? nullNumber).toList();
    setProperty(object, offset, list ?? nullNumber);
  }

  @tryInline
  @override
  void writeDateTimeList(int offset, List<DateTime?>? values) {
    final list = values
        ?.map((e) => e?.toUtc().millisecondsSinceEpoch ?? nullNumber)
        .toList();
    setProperty(object, offset, list ?? nullNumber);
  }

  @tryInline
  @override
  void writeStringList(int offset, List<String?>? values) {
    final list = values?.map((e) => e ?? nullNumber).toList();
    setProperty(object, offset, list ?? nullNumber);
  }

  @tryInline
  @override
  void writeObjectList<T>(
    int offset,
    Map<Type, List<int>> allOffsets,
    Serialize<T> serialize,
    List<T?>? values,
  ) {
    if (values != null) {
      final list = values.map((e) {
        if (e != null) {
          final object = newObject<Object>();
          final writer = IsarWriterImpl(object);
          serialize(e, writer, allOffsets[T]!, allOffsets);
          return object;
        }
      }).toList();
      setProperty(object, offset, list);
    }
  }
}
82 lib/src/web/open.dart Normal file
@@ -0,0 +1,82 @@
// ignore_for_file: public_member_api_docs, invalid_use_of_protected_member

import 'dart:html';
//import 'dart:js_util';

import 'package:isar/isar.dart';
/*import 'package:isar/src/common/schemas.dart';

import 'package:isar/src/web/bindings.dart';
import 'package:isar/src/web/isar_collection_impl.dart';
import 'package:isar/src/web/isar_impl.dart';*/
import 'package:isar/src/web/isar_web.dart';
import 'package:meta/meta.dart';

bool _loaded = false;
Future<void> initializeIsarWeb([String? jsUrl]) async {
  if (_loaded) {
    return;
  }
  _loaded = true;

  final script = ScriptElement();
  script.type = 'text/javascript';
  // ignore: unsafe_html
  script.src = 'https://unpkg.com/isar@${Isar.version}/dist/index.js';
  script.async = true;
  document.head!.append(script);
  await script.onLoad.first.timeout(
    const Duration(seconds: 30),
    onTimeout: () {
      throw IsarError('Failed to load Isar');
    },
  );
}

@visibleForTesting
void doNotInitializeIsarWeb() {
  _loaded = true;
}

Future<Isar> openIsar({
  required List<CollectionSchema<dynamic>> schemas,
  String? directory,
  required String name,
  required int maxSizeMiB,
  required bool relaxedDurability,
  CompactCondition? compactOnLaunch,
}) async {
  throw IsarError('Please use Isar 2.5.0 if you need web support. '
      'A 3.x version with web support will be released soon.');
  /*await initializeIsarWeb();
  final schemasJson = getSchemas(schemas).map((e) => e.toJson());
  final schemasJs = jsify(schemasJson.toList()) as List<dynamic>;
  final instance = await openIsarJs(name, schemasJs, relaxedDurability)
      .wait<IsarInstanceJs>();
  final isar = IsarImpl(name, instance);
  final cols = <Type, IsarCollection<dynamic>>{};
  for (final schema in schemas) {
    final col = instance.getCollection(schema.name);
    schema.toCollection(<OBJ>() {
      schema as CollectionSchema<OBJ>;
      cols[OBJ] = IsarCollectionImpl<OBJ>(
        isar: isar,
        native: col,
        schema: schema,
      );
    });
  }

  isar.attachCollections(cols);
  return isar;*/
}

Isar openIsarSync({
  required List<CollectionSchema<dynamic>> schemas,
  String? directory,
  required String name,
  required int maxSizeMiB,
  required bool relaxedDurability,
  CompactCondition? compactOnLaunch,
}) =>
    unsupportedOnWeb();
375 lib/src/web/query_build.dart Normal file
@@ -0,0 +1,375 @@
|
|||||||
|
// ignore_for_file: public_member_api_docs, invalid_use_of_protected_member
|
||||||
|
|
||||||
|
import 'dart:indexed_db';
|
||||||
|
|
||||||
|
import 'package:isar/isar.dart';
|
||||||
|
|
||||||
|
import 'package:isar/src/web/bindings.dart';
|
||||||
|
import 'package:isar/src/web/isar_collection_impl.dart';
|
||||||
|
import 'package:isar/src/web/isar_web.dart';
|
||||||
|
import 'package:isar/src/web/query_impl.dart';
|
||||||
|
|
||||||
|
Query<T> buildWebQuery<T, OBJ>(
|
||||||
|
IsarCollectionImpl<OBJ> col,
|
||||||
|
List<WhereClause> whereClauses,
|
||||||
|
bool whereDistinct,
|
||||||
|
Sort whereSort,
|
||||||
|
FilterOperation? filter,
|
||||||
|
List<SortProperty> sortBy,
|
||||||
|
List<DistinctProperty> distinctBy,
|
||||||
|
int? offset,
|
||||||
|
int? limit,
|
||||||
|
String? property,
|
||||||
|
) {
|
||||||
|
final whereClausesJs = whereClauses.map((wc) {
|
||||||
|
if (wc is IdWhereClause) {
|
||||||
|
return _buildIdWhereClause(wc);
|
||||||
|
} else if (wc is IndexWhereClause) {
|
||||||
|
return _buildIndexWhereClause(col.schema, wc);
|
||||||
|
} else {
|
||||||
|
return _buildLinkWhereClause(col, wc as LinkWhereClause);
|
||||||
|
}
|
||||||
|
}).toList();
|
||||||
|
|
||||||
|
final filterJs = filter != null ? _buildFilter(col.schema, filter) : null;
|
||||||
|
final sortJs = sortBy.isNotEmpty ? _buildSort(sortBy) : null;
|
||||||
|
final distinctJs = distinctBy.isNotEmpty ? _buildDistinct(distinctBy) : null;
|
||||||
|
|
||||||
|
final queryJs = QueryJs(
|
||||||
|
col.native,
|
||||||
|
whereClausesJs,
|
||||||
|
whereDistinct,
|
||||||
|
whereSort == Sort.asc,
|
||||||
|
filterJs,
|
||||||
|
sortJs,
|
||||||
|
distinctJs,
|
||||||
|
offset,
|
||||||
|
limit,
|
||||||
|
);
|
||||||
|
|
||||||
|
QueryDeserialize<T> deserialize;
|
||||||
|
//if (property == null) {
|
||||||
|
deserialize = col.deserializeObject as T Function(Object);
|
||||||
|
/*} else {
|
||||||
|
deserialize = (jsObj) => col.schema.deserializeProp(jsObj, property) as T;
|
||||||
|
}*/
|
||||||
|
|
||||||
|
return QueryImpl<T>(col, queryJs, deserialize, property);
|
||||||
|
}
|
||||||
|
|
||||||
|
dynamic _valueToJs(dynamic value) {
|
||||||
|
if (value == null) {
|
||||||
|
return double.negativeInfinity;
|
||||||
|
} else if (value == true) {
|
||||||
|
return 1;
|
||||||
|
} else if (value == false) {
|
||||||
|
return 0;
|
||||||
|
} else if (value is DateTime) {
|
||||||
|
return value.toUtc().millisecondsSinceEpoch;
|
||||||
|
} else if (value is List) {
|
||||||
|
return value.map(_valueToJs).toList();
|
||||||
|
} else {
|
||||||
|
return value;
|
||||||
|
}
|
||||||
|
}
|
||||||
|

IdWhereClauseJs _buildIdWhereClause(IdWhereClause wc) {
  return IdWhereClauseJs()
    ..range = _buildKeyRange(
      wc.lower,
      wc.upper,
      wc.includeLower,
      wc.includeUpper,
    );
}

IndexWhereClauseJs _buildIndexWhereClause(
  CollectionSchema<dynamic> schema,
  IndexWhereClause wc,
) {
  final index = schema.index(wc.indexName);

  final lower = wc.lower?.toList();
  final upper = wc.upper?.toList();
  if (upper != null) {
    while (index.properties.length > upper.length) {
      upper.add([]);
    }
  }

  dynamic lowerUnwrapped = wc.lower;
  if (index.properties.length == 1 && lower != null) {
    lowerUnwrapped = lower.isNotEmpty ? lower[0] : null;
  }

  dynamic upperUnwrapped = upper;
  if (index.properties.length == 1 && upper != null) {
    upperUnwrapped = upper.isNotEmpty ? upper[0] : double.infinity;
  }

  return IndexWhereClauseJs()
    ..indexName = wc.indexName
    ..range = _buildKeyRange(
      wc.lower != null ? _valueToJs(lowerUnwrapped) : null,
      wc.upper != null ? _valueToJs(upperUnwrapped) : null,
      wc.includeLower,
      wc.includeUpper,
    );
}

LinkWhereClauseJs _buildLinkWhereClause(
  IsarCollectionImpl<dynamic> col,
  LinkWhereClause wc,
) {
  // ignore: unused_local_variable
  final linkCol = col.isar.getCollectionByNameInternal(wc.linkCollection)!
      as IsarCollectionImpl;
  //final backlinkLinkName = linkCol.schema.backlinkLinkNames[wc.linkName];
  return LinkWhereClauseJs()
    ..linkCollection = wc.linkCollection
    //..linkName = backlinkLinkName ?? wc.linkName
    //..backlink = backlinkLinkName != null
    ..id = wc.id;
}

KeyRange? _buildKeyRange(
  dynamic lower,
  dynamic upper,
  bool includeLower,
  bool includeUpper,
) {
  if (lower != null) {
    if (upper != null) {
      final boundsEqual = idbCmp(lower, upper) == 0;
      if (boundsEqual) {
        if (includeLower && includeUpper) {
          return KeyRange.only(lower);
        } else {
          // empty range
          return KeyRange.upperBound(double.negativeInfinity, true);
        }
      }

      return KeyRange.bound(
        lower,
        upper,
        !includeLower,
        !includeUpper,
      );
    } else {
      return KeyRange.lowerBound(lower, !includeLower);
    }
  } else if (upper != null) {
    return KeyRange.upperBound(upper, !includeUpper);
  }
  return null;
}

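The bound selection in `_buildKeyRange` can be sketched as a pure JavaScript function that returns a plain descriptor instead of a real `IDBKeyRange` (the `buildKeyRange` name and descriptor shape are hypothetical; `cmp` stands in for `indexedDB.cmp`). Note the empty-range trick: equal but exclusive bounds become "everything below `-Infinity`", which matches nothing.

```javascript
// Hypothetical pure sketch of the bound-selection logic (no real IDBKeyRange).
function buildKeyRange(lower, upper, includeLower, includeUpper, cmp) {
  if (lower != null) {
    if (upper != null) {
      if (cmp(lower, upper) === 0) {
        return includeLower && includeUpper
          ? { only: lower } // KeyRange.only
          // Equal bounds, at least one exclusive: an empty range.
          : { upper: -Infinity, upperOpen: true };
      }
      return { lower, upper, lowerOpen: !includeLower, upperOpen: !includeUpper };
    }
    return { lower, lowerOpen: !includeLower };
  } else if (upper != null) {
    return { upper, upperOpen: !includeUpper };
  }
  return null; // unbounded: scan everything
}
```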
FilterJs? _buildFilter(
  CollectionSchema<dynamic> schema,
  FilterOperation filter,
) {
  final filterStr = _buildFilterOperation(schema, filter);
  if (filterStr != null) {
    return FilterJs('id', 'obj', 'return $filterStr');
  } else {
    return null;
  }
}

String? _buildFilterOperation(
  CollectionSchema<dynamic> schema,
  FilterOperation filter,
) {
  if (filter is FilterGroup) {
    return _buildFilterGroup(schema, filter);
  } else if (filter is LinkFilter) {
    unsupportedOnWeb();
  } else if (filter is FilterCondition) {
    return _buildCondition(schema, filter);
  } else {
    return null;
  }
}

String? _buildFilterGroup(CollectionSchema<dynamic> schema, FilterGroup group) {
  final builtConditions = group.filters
      .map((op) => _buildFilterOperation(schema, op))
      .where((e) => e != null)
      .toList();

  if (builtConditions.isEmpty) {
    return null;
  }

  if (group.type == FilterGroupType.not) {
    return '!(${builtConditions[0]})';
  } else if (builtConditions.length == 1) {
    return builtConditions[0];
  } else if (group.type == FilterGroupType.xor) {
    final conditions = builtConditions.join(',');
    return 'IsarQuery.xor($conditions)';
  } else {
    final op = group.type == FilterGroupType.or ? '||' : '&&';
    final condition = builtConditions.join(op);
    return '($condition)';
  }
}

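The strings produced by `_buildFilterGroup` and `_buildCondition` are JavaScript expressions; on the JS side, `FilterJs('id', 'obj', 'return …')` corresponds to compiling such a string with the `Function` constructor. A minimal sketch with a hypothetical composed filter string (the property names and values are invented for illustration):

```javascript
// A filter expression like _buildFilterGroup would emit for an AND group
// (hypothetical sample, not generated by the real builder).
const filterSrc =
  '(obj.age >= 18 && (typeof obj.name == "string" && obj.name.includes("an")))';
// FilterJs('id', 'obj', 'return <expr>') maps to the Function constructor.
const predicate = new Function('id', 'obj', 'return ' + filterSrc);
```

Each candidate record is then run through `predicate(id, obj)` during a scan.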
String _buildCondition(
  CollectionSchema<dynamic> schema,
  FilterCondition condition,
) {
  dynamic _prepareFilterValue(dynamic value) {
    if (value == null) {
      return null;
    } else if (value is String) {
      return stringify(value);
    } else {
      return _valueToJs(value);
    }
  }

  final isListOp = condition.type != FilterConditionType.isNull &&
      condition.type != FilterConditionType.listLength &&
      schema.property(condition.property).type.isList;
  final accessor =
      condition.property == schema.idName ? 'id' : 'obj.${condition.property}';
  final variable = isListOp ? 'e' : accessor;

  final cond = _buildConditionInternal(
    conditionType: condition.type,
    variable: variable,
    val1: _prepareFilterValue(condition.value1),
    include1: condition.include1,
    val2: _prepareFilterValue(condition.value2),
    include2: condition.include2,
    caseSensitive: condition.caseSensitive,
  );

  if (isListOp) {
    return '(Array.isArray($accessor) && $accessor.some(e => $cond))';
  } else {
    return cond;
  }
}

String _buildConditionInternal({
  required FilterConditionType conditionType,
  required String variable,
  required Object? val1,
  required bool include1,
  required Object? val2,
  required bool include2,
  required bool caseSensitive,
}) {
  final isNull = '($variable == null || $variable === -Infinity)';
  switch (conditionType) {
    case FilterConditionType.equalTo:
      if (val1 == null) {
        return isNull;
      } else if (val1 is String && !caseSensitive) {
        return '$variable?.toLowerCase() === ${val1.toLowerCase()}';
      } else {
        return '$variable === $val1';
      }
    case FilterConditionType.between:
      final val = val1 ?? val2;
      final lowerOp = include1 ? '>=' : '>';
      final upperOp = include2 ? '<=' : '<';
      if (val == null) {
        return isNull;
      } else if ((val1 is String?) && (val2 is String?) && !caseSensitive) {
        final lower = val1?.toLowerCase() ?? '-Infinity';
        final upper = val2?.toLowerCase() ?? '-Infinity';
        final variableLc = '$variable?.toLowerCase() ?? -Infinity';
        final lowerCond = 'indexedDB.cmp($variableLc, $lower) $lowerOp 0';
        final upperCond = 'indexedDB.cmp($variableLc, $upper) $upperOp 0';
        return '($lowerCond && $upperCond)';
      } else {
        final lowerCond =
            'indexedDB.cmp($variable, ${val1 ?? '-Infinity'}) $lowerOp 0';
        final upperCond =
            'indexedDB.cmp($variable, ${val2 ?? '-Infinity'}) $upperOp 0';
        return '($lowerCond && $upperCond)';
      }
    case FilterConditionType.lessThan:
      if (val1 == null) {
        if (include1) {
          return isNull;
        } else {
          return 'false';
        }
      } else {
        final op = include1 ? '<=' : '<';
        if (val1 is String && !caseSensitive) {
          return 'indexedDB.cmp($variable?.toLowerCase() ?? '
              '-Infinity, ${val1.toLowerCase()}) $op 0';
        } else {
          return 'indexedDB.cmp($variable, $val1) $op 0';
        }
      }
    case FilterConditionType.greaterThan:
      if (val1 == null) {
        if (include1) {
          return 'true';
        } else {
          return '!$isNull';
        }
      } else {
        final op = include1 ? '>=' : '>';
        if (val1 is String && !caseSensitive) {
          return 'indexedDB.cmp($variable?.toLowerCase() ?? '
              '-Infinity, ${val1.toLowerCase()}) $op 0';
        } else {
          return 'indexedDB.cmp($variable, $val1) $op 0';
        }
      }
    case FilterConditionType.startsWith:
    case FilterConditionType.endsWith:
    case FilterConditionType.contains:
      final op = conditionType == FilterConditionType.startsWith
          ? 'startsWith'
          : conditionType == FilterConditionType.endsWith
              ? 'endsWith'
              : 'includes';
      if (val1 is String) {
        final isString = 'typeof $variable == "string"';
        if (!caseSensitive) {
          return '($isString && $variable.toLowerCase() '
              '.$op(${val1.toLowerCase()}))';
        } else {
          return '($isString && $variable.$op($val1))';
        }
      } else {
        throw IsarError('Unsupported type for condition');
      }
    case FilterConditionType.matches:
      throw UnimplementedError();
    case FilterConditionType.isNull:
      return isNull;
    // ignore: no_default_cases
    default:
      throw UnimplementedError();
  }
}

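For list properties, `_buildCondition` wraps the per-element expression from `_buildConditionInternal` in an `Array.isArray` guard and a `some()` call, so the condition matches if any element matches. A minimal sketch of evaluating such a generated expression (the sample condition string is hypothetical):

```javascript
// A list condition like _buildCondition emits for `tags` containing 5
// (hypothetical sample). 'e' is the per-element variable.
const cond = '(Array.isArray(obj.tags) && obj.tags.some(e => e === 5))';
const predicate = new Function('id', 'obj', 'return ' + cond);
```

The `Array.isArray` guard makes missing or null lists evaluate to `false` rather than throwing.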
SortCmpJs _buildSort(List<SortProperty> properties) {
  final sort = properties.map((e) {
    final op = e.sort == Sort.asc ? '' : '-';
    return '${op}indexedDB.cmp(a.${e.property} ?? "-Infinity", b.${e.property} '
        '?? "-Infinity")';
  }).join('||');
  return SortCmpJs('a', 'b', 'return $sort');
}

DistinctValueJs _buildDistinct(List<DistinctProperty> properties) {
  final distinct = properties.map((e) {
    if (e.caseSensitive == false) {
      return 'obj.${e.property}?.toLowerCase() ?? "-Infinity"';
    } else {
      return 'obj.${e.property}?.toString() ?? "-Infinity"';
    }
  }).join('+');
  return DistinctValueJs('obj', 'return $distinct');
}
180
lib/src/web/query_impl.dart
Normal file
@ -0,0 +1,180 @@
// ignore_for_file: public_member_api_docs

import 'dart:async';
import 'dart:convert';
import 'dart:js';
import 'dart:typed_data';

import 'package:isar/isar.dart';
import 'package:isar/src/web/bindings.dart';

import 'package:isar/src/web/isar_collection_impl.dart';
import 'package:isar/src/web/isar_web.dart';

typedef QueryDeserialize<T> = T Function(Object);

class QueryImpl<T> extends Query<T> {
  QueryImpl(this.col, this.queryJs, this.deserialize, this.propertyName);
  final IsarCollectionImpl<dynamic> col;
  final QueryJs queryJs;
  final QueryDeserialize<T> deserialize;
  final String? propertyName;

  @override
  Isar get isar => col.isar;

  @override
  Future<T?> findFirst() {
    return col.isar.getTxn(false, (IsarTxnJs txn) async {
      final result = await queryJs.findFirst(txn).wait<Object?>();
      if (result == null) {
        return null;
      }
      return deserialize(result);
    });
  }

  @override
  T? findFirstSync() => unsupportedOnWeb();

  @override
  Future<List<T>> findAll() {
    return col.isar.getTxn(false, (IsarTxnJs txn) async {
      final result = await queryJs.findAll(txn).wait<List<dynamic>>();
      return result.map((e) => deserialize(e as Object)).toList();
    });
  }

  @override
  List<T> findAllSync() => unsupportedOnWeb();

  @override
  Future<R?> aggregate<R>(AggregationOp op) {
    return col.isar.getTxn(false, (IsarTxnJs txn) async {
      final property = propertyName ?? col.schema.idName;

      num? result;
      switch (op) {
        case AggregationOp.min:
          result = await queryJs.min(txn, property).wait();
          break;
        case AggregationOp.max:
          result = await queryJs.max(txn, property).wait();
          break;
        case AggregationOp.sum:
          result = await queryJs.sum(txn, property).wait();
          break;
        case AggregationOp.average:
          result = await queryJs.average(txn, property).wait();
          break;
        case AggregationOp.count:
          result = await queryJs.count(txn).wait();
          break;
        // ignore: no_default_cases
        default:
          throw UnimplementedError();
      }

      if (result == null) {
        return null;
      }

      if (R == DateTime) {
        return DateTime.fromMillisecondsSinceEpoch(result.toInt()).toLocal()
            as R;
      } else if (R == int) {
        return result.toInt() as R;
      } else if (R == double) {
        return result.toDouble() as R;
      } else {
        return null;
      }
    });
  }

  @override
  R? aggregateSync<R>(AggregationOp op) => unsupportedOnWeb();

  @override
  Future<bool> deleteFirst() {
    return col.isar.getTxn(true, (IsarTxnJs txn) {
      return queryJs.deleteFirst(txn).wait();
    });
  }

  @override
  bool deleteFirstSync() => unsupportedOnWeb();

  @override
  Future<int> deleteAll() {
    return col.isar.getTxn(true, (IsarTxnJs txn) {
      return queryJs.deleteAll(txn).wait();
    });
  }

  @override
  int deleteAllSync() => unsupportedOnWeb();

  @override
  Stream<List<T>> watch({bool fireImmediately = false}) {
    JsFunction? stop;
    final controller = StreamController<List<T>>(
      onCancel: () {
        stop?.apply([]);
      },
    );

    if (fireImmediately) {
      findAll().then(controller.add);
    }

    final Null Function(List<dynamic> results) callback =
        allowInterop((List<dynamic> results) {
      controller.add(results.map((e) => deserialize(e as Object)).toList());
    });
    stop = col.native.watchQuery(queryJs, callback);

    return controller.stream;
  }

  @override
  Stream<void> watchLazy({bool fireImmediately = false}) {
    JsFunction? stop;
    final controller = StreamController<void>(
      onCancel: () {
        stop?.apply([]);
      },
    );

    final Null Function() callback = allowInterop(() {
      controller.add(null);
    });
    stop = col.native.watchQueryLazy(queryJs, callback);

    return controller.stream;
  }

  @override
  Future<R> exportJsonRaw<R>(R Function(Uint8List) callback) async {
    return col.isar.getTxn(false, (IsarTxnJs txn) async {
      final result = await queryJs.findAll(txn).wait<dynamic>();
      final jsonStr = stringify(result);
      return callback(const Utf8Encoder().convert(jsonStr));
    });
  }

  @override
  Future<List<Map<String, dynamic>>> exportJson() {
    return col.isar.getTxn(false, (IsarTxnJs txn) async {
      final result = await queryJs.findAll(txn).wait<List<dynamic>>();
      return result.map((e) => jsMapToDart(e as Object)).toList();
    });
  }

  @override
  R exportJsonRawSync<R>(R Function(Uint8List) callback) => unsupportedOnWeb();

  @override
  List<Map<String, dynamic>> exportJsonSync({bool primitiveNull = true}) =>
      unsupportedOnWeb();
}
5
lib/src/web/split_words.dart
Normal file
@ -0,0 +1,5 @@
// ignore_for_file: public_member_api_docs

import 'package:isar/src/web/isar_web.dart';

List<String> isarSplitWords(String input) => unsupportedOnWeb();
22
pubspec.yaml
Normal file
@ -0,0 +1,22 @@
name: isar
description: Extremely fast, easy to use, and fully async NoSQL database for Flutter.
version: 3.1.0+1
repository: https://github.com/isar/isar/tree/main/packages/isar
homepage: https://github.com/isar/isar
issue_tracker: https://github.com/isar/isar/issues
documentation: https://isar.dev
funding:
  - https://github.com/sponsors/leisim/

environment:
  sdk: ">=2.17.0 <3.0.0"

dependencies:
  ffi: ">=2.0.0 <3.0.0"
  js: ^0.6.4
  meta: ^1.7.0

dev_dependencies:
  ffigen: ">=6.1.2 <8.0.0"
  test: ^1.21.1
  very_good_analysis: ^3.0.1
287
test/isar_reader_writer_test.dart
Normal file
@ -0,0 +1,287 @@
@TestOn('vm')

// ignore_for_file: constant_identifier_names

import 'dart:convert';
import 'dart:io';
import 'dart:typed_data';

import 'package:isar/isar.dart';
import 'package:isar/src/native/isar_core.dart';
import 'package:isar/src/native/isar_reader_impl.dart';
import 'package:isar/src/native/isar_writer_impl.dart';
import 'package:test/test.dart';

void main() {
  group('Golden Binary', () {
    late final json =
        File('../isar_core/tests/binary_golden.json').readAsStringSync();
    late final tests = (jsonDecode(json) as List<dynamic>)
        .map((e) => BinaryTest.fromJson(e as Map<String, dynamic>))
        .toList();

    test('IsarReader', () {
      var t = 0;
      for (final test in tests) {
        final reader = IsarReaderImpl(Uint8List.fromList(test.bytes));
        var offset = 2;
        for (var i = 0; i < test.types.length; i++) {
          final type = test.types[i];
          final nullableValue = type.read(reader, offset, true);
          expect(nullableValue, test.values[i], reason: '${test.types} $t');

          final nonNullableValue = type.read(reader, offset, false);
          _expectIgnoreNull(nonNullableValue, test.values[i], type);
          offset += type.size;
        }
        t++;
      }
    });

    test('IsarWriter', () {
      for (final test in tests) {
        final buffer = Uint8List(10000);
        final size =
            test.types.fold<int>(0, (sum, type) => sum + type.size) + 2;

        final bufferView = buffer.buffer.asUint8List(0, test.bytes.length);
        final writer = IsarWriterImpl(bufferView, size);
        var offset = 2;
        for (var i = 0; i < test.types.length; i++) {
          final type = test.types[i];
          final value = test.values[i];
          type.write(writer, offset, value);
          offset += type.size;
        }

        expect(buffer.sublist(0, test.bytes.length), test.bytes);
      }
    });
  });
}

enum Type {
  Bool(1, false, _readBool, _writeBool),
  Byte(1, 0, _readByte, _writeByte),
  Int(4, nullInt, _readInt, _writeInt),
  Float(4, nullFloat, _readFloat, _writeFloat),
  Long(8, nullLong, _readLong, _writeLong),
  Double(8, nullDouble, _readDouble, _writeDouble),
  String(3, '', _readString, _writeString),
  BoolList(3, false, _readBoolList, _writeBoolList),
  ByteList(3, 0, _readByteList, _writeByteList),
  IntList(3, nullInt, _readIntList, _writeIntList),
  FloatList(3, nullFloat, _readFloatList, _writeFloatList),
  LongList(3, nullLong, _readLongList, _writeLongList),
  DoubleList(3, nullDouble, _readDoubleList, _writeDoubleList),
  StringList(3, '', _readStringList, _writeStringList);

  const Type(this.size, this.nullValue, this.read, this.write);

  final int size;
  final dynamic nullValue;
  final dynamic Function(IsarReader reader, int offset, bool nullable) read;
  final void Function(IsarWriter writer, int offset, dynamic value) write;
}

class BinaryTest {
  const BinaryTest(this.types, this.values, this.bytes);

  factory BinaryTest.fromJson(Map<String, dynamic> json) {
    return BinaryTest(
      (json['types'] as List)
          .map((type) => Type.values.firstWhere((t) => t.name == type))
          .toList(),
      json['values'] as List,
      (json['bytes'] as List).cast(),
    );
  }

  final List<Type> types;
  final List<dynamic> values;
  final List<int> bytes;
}

void _expectIgnoreNull(
  dynamic left,
  dynamic right,
  Type type, {
  bool inList = false,
}) {
  if (right == null && (type.index < Type.BoolList.index || inList)) {
    if (left is double) {
      expect(left, isNaN);
    } else {
      expect(left, type.nullValue);
    }
  } else if (right is List) {
    left as List;
    for (var i = 0; i < right.length; i++) {
      _expectIgnoreNull(left[i], right[i], type, inList: true);
    }
  } else {
    expect(left, right);
  }
}

bool? _readBool(IsarReader reader, int offset, bool nullable) {
  if (nullable) {
    return reader.readBoolOrNull(offset);
  } else {
    return reader.readBool(offset);
  }
}

void _writeBool(IsarWriter writer, int offset, dynamic value) {
  writer.writeBool(offset, value as bool?);
}

int? _readByte(IsarReader reader, int offset, bool nullable) {
  return reader.readByte(offset);
}

void _writeByte(IsarWriter writer, int offset, dynamic value) {
  writer.writeByte(offset, value as int);
}

int? _readInt(IsarReader reader, int offset, bool nullable) {
  if (nullable) {
    return reader.readIntOrNull(offset);
  } else {
    return reader.readInt(offset);
  }
}

void _writeInt(IsarWriter writer, int offset, dynamic value) {
  writer.writeInt(offset, value as int?);
}

double? _readFloat(IsarReader reader, int offset, bool nullable) {
  if (nullable) {
    return reader.readFloatOrNull(offset);
  } else {
    return reader.readFloat(offset);
  }
}

void _writeFloat(IsarWriter writer, int offset, dynamic value) {
  writer.writeFloat(offset, value as double?);
}

int? _readLong(IsarReader reader, int offset, bool nullable) {
  if (nullable) {
    return reader.readLongOrNull(offset);
  } else {
    return reader.readLong(offset);
  }
}

void _writeLong(IsarWriter writer, int offset, dynamic value) {
  writer.writeLong(offset, value as int?);
}

double? _readDouble(IsarReader reader, int offset, bool nullable) {
  if (nullable) {
    return reader.readDoubleOrNull(offset);
  } else {
    return reader.readDouble(offset);
  }
}

void _writeDouble(IsarWriter writer, int offset, dynamic value) {
  writer.writeDouble(offset, value as double?);
}

String? _readString(IsarReader reader, int offset, bool nullable) {
  if (nullable) {
    return reader.readStringOrNull(offset);
  } else {
    return reader.readString(offset);
  }
}

void _writeString(IsarWriter writer, int offset, dynamic value) {
  final bytes = value is String ? utf8.encode(value) as Uint8List : null;
  writer.writeByteList(offset, bytes);
}

List<bool?>? _readBoolList(IsarReader reader, int offset, bool nullable) {
  if (nullable) {
    return reader.readBoolOrNullList(offset);
  } else {
    return reader.readBoolList(offset);
  }
}

void _writeBoolList(IsarWriter writer, int offset, dynamic value) {
  writer.writeBoolList(offset, (value as List?)?.cast());
}

List<int>? _readByteList(IsarReader reader, int offset, bool nullable) {
  return reader.readByteList(offset);
}

void _writeByteList(IsarWriter writer, int offset, dynamic value) {
  final bytes = value is List ? Uint8List.fromList(value.cast()) : null;
  writer.writeByteList(offset, bytes);
}

List<int?>? _readIntList(IsarReader reader, int offset, bool nullable) {
  if (nullable) {
    return reader.readIntOrNullList(offset);
  } else {
    return reader.readIntList(offset);
  }
}

void _writeIntList(IsarWriter writer, int offset, dynamic value) {
  writer.writeIntList(offset, (value as List?)?.cast());
}

List<double?>? _readFloatList(IsarReader reader, int offset, bool nullable) {
  if (nullable) {
    return reader.readFloatOrNullList(offset);
  } else {
    return reader.readFloatList(offset);
  }
}

void _writeFloatList(IsarWriter writer, int offset, dynamic value) {
  writer.writeFloatList(offset, (value as List?)?.cast());
}

List<int?>? _readLongList(IsarReader reader, int offset, bool nullable) {
  if (nullable) {
    return reader.readLongOrNullList(offset);
  } else {
    return reader.readLongList(offset);
  }
}

void _writeLongList(IsarWriter writer, int offset, dynamic value) {
  writer.writeLongList(offset, (value as List?)?.cast());
}

List<double?>? _readDoubleList(IsarReader reader, int offset, bool nullable) {
  if (nullable) {
    return reader.readDoubleOrNullList(offset);
  } else {
    return reader.readDoubleList(offset);
  }
}

void _writeDoubleList(IsarWriter writer, int offset, dynamic value) {
  writer.writeDoubleList(offset, (value as List?)?.cast());
}

List<String?>? _readStringList(IsarReader reader, int offset, bool nullable) {
  if (nullable) {
    return reader.readStringOrNullList(offset);
  } else {
    return reader.readStringList(offset);
  }
}

void _writeStringList(IsarWriter writer, int offset, dynamic value) {
  writer.writeStringList(offset, (value as List?)?.cast());
}
6
tool/get_version.dart
Normal file
@ -0,0 +1,6 @@
import 'package:isar/isar.dart';

void main() {
  // ignore: avoid_print
  print(Isar.version);
}
9
tool/verify_release_version.dart
Normal file
@ -0,0 +1,9 @@
import 'package:isar/isar.dart';

void main(List<String> args) {
  if (Isar.version != args[0]) {
    throw StateError(
      'Invalid Isar version for release: ${Isar.version} != ${args[0]}',
    );
  }
}