psygnal-0.15.0/CHANGELOG.md

# Changelog

## [v0.15.0](https://github.com/pyapp-kit/psygnal/tree/v0.15.0) (2025-10-15)

[Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.14.2...v0.15.0)

**Merged pull requests:**

- build: drop 3.9, support 3.14 [\#391](https://github.com/pyapp-kit/psygnal/pull/391) ([tlambert03](https://github.com/tlambert03))
- ci\(dependabot\): bump astral-sh/setup-uv from 6 to 7 [\#390](https://github.com/pyapp-kit/psygnal/pull/390) ([dependabot[bot]](https://github.com/apps/dependabot))
- ci\(pre-commit.ci\): autoupdate [\#389](https://github.com/pyapp-kit/psygnal/pull/389) ([pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci))
- ci\(dependabot\): bump pypa/cibuildwheel from 3.1 to 3.2 [\#388](https://github.com/pyapp-kit/psygnal/pull/388) ([dependabot[bot]](https://github.com/apps/dependabot))

## [v0.14.2](https://github.com/pyapp-kit/psygnal/tree/v0.14.2) (2025-09-24)

[Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.14.1...v0.14.2)

**Implemented enhancements:**

- feat: Allow to replace selection content in with one event emitted [\#387](https://github.com/pyapp-kit/psygnal/pull/387) ([Czaki](https://github.com/Czaki))

**Merged pull requests:**

- ci\(dependabot\): bump actions/setup-python from 5 to 6 [\#386](https://github.com/pyapp-kit/psygnal/pull/386) ([dependabot[bot]](https://github.com/apps/dependabot))
- ci\(dependabot\): bump CodSpeedHQ/action from 3 to 4 [\#385](https://github.com/pyapp-kit/psygnal/pull/385) ([dependabot[bot]](https://github.com/apps/dependabot))
- ci\(pre-commit.ci\): autoupdate [\#384](https://github.com/pyapp-kit/psygnal/pull/384) ([pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci))
- ci\(dependabot\): bump actions/checkout from 4 to 5 [\#383](https://github.com/pyapp-kit/psygnal/pull/383) ([dependabot[bot]](https://github.com/apps/dependabot))

## [v0.14.1](https://github.com/pyapp-kit/psygnal/tree/v0.14.1) (2025-08-12)

[Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.14.0...v0.14.1)

**Implemented enhancements:**

- feat: allow SignalInstances for evented dataclass fields to emit on field mutation \(in addition to field change\) [\#379](https://github.com/pyapp-kit/psygnal/pull/379) ([tlambert03](https://github.com/tlambert03))
- feat: allow psygnal.testing functions to accept connection kwargs [\#378](https://github.com/pyapp-kit/psygnal/pull/378) ([tlambert03](https://github.com/tlambert03))

**Merged pull requests:**

- ci\(dependabot\): bump actions/download-artifact from 4 to 5 [\#382](https://github.com/pyapp-kit/psygnal/pull/382) ([dependabot[bot]](https://github.com/apps/dependabot))
- ci\(dependabot\): bump pypa/cibuildwheel from 3.0 to 3.1 [\#381](https://github.com/pyapp-kit/psygnal/pull/381) ([dependabot[bot]](https://github.com/apps/dependabot))
- ci\(pre-commit.ci\): autoupdate [\#380](https://github.com/pyapp-kit/psygnal/pull/380) ([pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci))

## [v0.14.0](https://github.com/pyapp-kit/psygnal/tree/v0.14.0) (2025-07-01)

[Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.13.0...v0.14.0)

**Implemented enhancements:**

- feat: support coroutine functions [\#346](https://github.com/pyapp-kit/psygnal/pull/346) ([tlambert03](https://github.com/tlambert03))
- feat: support pydantic @computed\_fields, and arbitrary `field_dependents` for event emission [\#340](https://github.com/pyapp-kit/psygnal/pull/340) ([tlambert03](https://github.com/tlambert03))
-
feat: Support bubbling up of events from evented children on dataclasses [\#298](https://github.com/pyapp-kit/psygnal/pull/298) ([tlambert03](https://github.com/tlambert03)) **Fixed bugs:** - fix: fix evented container pydantic serialization [\#377](https://github.com/pyapp-kit/psygnal/pull/377) ([tlambert03](https://github.com/tlambert03)) - fix: fix typing in psygnal testing [\#374](https://github.com/pyapp-kit/psygnal/pull/374) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - chore: Update mypyc build [\#373](https://github.com/pyapp-kit/psygnal/pull/373) ([tlambert03](https://github.com/tlambert03)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.23 to 3.0 [\#372](https://github.com/pyapp-kit/psygnal/pull/372) ([dependabot[bot]](https://github.com/apps/dependabot)) ## [v0.13.0](https://github.com/pyapp-kit/psygnal/tree/v0.13.0) (2025-05-05) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.12.0...v0.13.0) **Implemented enhancements:** - feat: add testing utilities [\#368](https://github.com/pyapp-kit/psygnal/pull/368) ([tlambert03](https://github.com/tlambert03)) **Fixed bugs:** - fix: Don't use deprecated model\_fields access [\#364](https://github.com/pyapp-kit/psygnal/pull/364) ([s-t-e-v-e-n-k](https://github.com/s-t-e-v-e-n-k)) **Merged pull requests:** - build: fix building of wheels with uv [\#370](https://github.com/pyapp-kit/psygnal/pull/370) ([tlambert03](https://github.com/tlambert03)) - ci\(pre-commit.ci\): autoupdate [\#369](https://github.com/pyapp-kit/psygnal/pull/369) ([pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci)) - docs: general docs update, use mkdocs-api-autonav [\#367](https://github.com/pyapp-kit/psygnal/pull/367) ([tlambert03](https://github.com/tlambert03)) - build: use pyproject dependency groups and uv [\#366](https://github.com/pyapp-kit/psygnal/pull/366) ([tlambert03](https://github.com/tlambert03)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.22 to 2.23 [\#360](https://github.com/pyapp-kit/psygnal/pull/360) ([dependabot[bot]](https://github.com/apps/dependabot)) - build: Add back universal \(none-any\) wheel [\#358](https://github.com/pyapp-kit/psygnal/pull/358) ([tlambert03](https://github.com/tlambert03)) - ci\(pre-commit.ci\): autoupdate [\#355](https://github.com/pyapp-kit/psygnal/pull/355) ([pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci)) ## [v0.12.0](https://github.com/pyapp-kit/psygnal/tree/v0.12.0) (2025-02-03) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.11.1...v0.12.0) **Implemented enhancements:** - feat: add description to signalinstance [\#339](https://github.com/pyapp-kit/psygnal/pull/339) ([tlambert03](https://github.com/tlambert03)) - perf: add `emit_fast` method for 10x faster emission \(without safety checks\) [\#331](https://github.com/pyapp-kit/psygnal/pull/331) ([tlambert03](https://github.com/tlambert03)) **Fixed bugs:** - fix: use safer repr in warning [\#353](https://github.com/pyapp-kit/psygnal/pull/353) ([tlambert03](https://github.com/tlambert03)) - fix: fix use of computed\_field setter with field\_dependencies [\#336](https://github.com/pyapp-kit/psygnal/pull/336) ([tlambert03](https://github.com/tlambert03)) **Tests & CI:** - ci: fix wheel building [\#352](https://github.com/pyapp-kit/psygnal/pull/352) ([tlambert03](https://github.com/tlambert03)) - ci: fixing-tests for pyside6 [\#314](https://github.com/pyapp-kit/psygnal/pull/314) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - refactor: update exception 
message [\#351](https://github.com/pyapp-kit/psygnal/pull/351) ([tlambert03](https://github.com/tlambert03)) - docs: modified quickstart simple example [\#349](https://github.com/pyapp-kit/psygnal/pull/349) ([roynielsen17](https://github.com/roynielsen17)) - ci\(pre-commit.ci\): autoupdate [\#344](https://github.com/pyapp-kit/psygnal/pull/344) ([pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.21.3 to 2.22.0 [\#342](https://github.com/pyapp-kit/psygnal/pull/342) ([dependabot[bot]](https://github.com/apps/dependabot)) - build: drop python 3.8 [\#341](https://github.com/pyapp-kit/psygnal/pull/341) ([tlambert03](https://github.com/tlambert03)) - Update EventedModel docs and errors to refer `field_dependencies` [\#335](https://github.com/pyapp-kit/psygnal/pull/335) ([sjdemartini](https://github.com/sjdemartini)) - bug: change stack level on warning for \_check\_nargs [\#333](https://github.com/pyapp-kit/psygnal/pull/333) ([tlambert03](https://github.com/tlambert03)) - ci\(pre-commit.ci\): autoupdate [\#332](https://github.com/pyapp-kit/psygnal/pull/332) ([pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.21.0 to 2.21.3 [\#329](https://github.com/pyapp-kit/psygnal/pull/329) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.20.0 to 2.21.0 [\#324](https://github.com/pyapp-kit/psygnal/pull/324) ([dependabot[bot]](https://github.com/apps/dependabot)) - CI: Update host OS version, add Python 3.13 [\#323](https://github.com/pyapp-kit/psygnal/pull/323) ([EwoutH](https://github.com/EwoutH)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.19.2 to 2.20.0 [\#322](https://github.com/pyapp-kit/psygnal/pull/322) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(dependabot\): bump CodSpeedHQ/action from 2 to 3 [\#321](https://github.com/pyapp-kit/psygnal/pull/321) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.18.1 to 2.19.2 [\#318](https://github.com/pyapp-kit/psygnal/pull/318) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(pre-commit.ci\): autoupdate [\#317](https://github.com/pyapp-kit/psygnal/pull/317) ([pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci)) - ci\(pre-commit.ci\): autoupdate [\#313](https://github.com/pyapp-kit/psygnal/pull/313) ([pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.18.0 to 2.18.1 [\#312](https://github.com/pyapp-kit/psygnal/pull/312) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(dependabot\): bump softprops/action-gh-release from 1 to 2 [\#310](https://github.com/pyapp-kit/psygnal/pull/310) ([dependabot[bot]](https://github.com/apps/dependabot)) ## [v0.11.1](https://github.com/pyapp-kit/psygnal/tree/v0.11.1) (2024-05-07) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.11.0...v0.11.1) **Merged pull requests:** - perf: let EventedSet use clear\(\) method of underlying set [\#307](https://github.com/pyapp-kit/psygnal/pull/307) ([DanGonite57](https://github.com/DanGonite57)) ## [v0.11.0](https://github.com/pyapp-kit/psygnal/tree/v0.11.0) (2024-03-29) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.10.2...v0.11.0) **Implemented enhancements:** - refactor: change EmitLoopError message, and mechanism of info gathering [\#302](https://github.com/pyapp-kit/psygnal/pull/302) 
([tlambert03](https://github.com/tlambert03)) - feat: add signal aliases on SignalGroup [\#299](https://github.com/pyapp-kit/psygnal/pull/299) ([getzze](https://github.com/getzze)) - feat!: Rename `recursion_mode` to `reemission`. Rename `deferred` to `queued`. Add `latest-only` mode. \(technically breaking\) [\#296](https://github.com/pyapp-kit/psygnal/pull/296) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - ci\(dependabot\): bump pypa/cibuildwheel from 2.16.5 to 2.17.0 [\#303](https://github.com/pyapp-kit/psygnal/pull/303) ([dependabot[bot]](https://github.com/apps/dependabot)) ## [v0.10.2](https://github.com/pyapp-kit/psygnal/tree/v0.10.2) (2024-03-12) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.10.1...v0.10.2) **Fixed bugs:** - fix: fix hard reference to objects in emitted arguments [\#301](https://github.com/pyapp-kit/psygnal/pull/301) ([tlambert03](https://github.com/tlambert03)) ## [v0.10.1](https://github.com/pyapp-kit/psygnal/tree/v0.10.1) (2024-03-11) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.10.0...v0.10.1) **Implemented enhancements:** - feat: Add recursion\_mode \('immediate' or 'deferred'\) to Signal and SignalInstance [\#293](https://github.com/pyapp-kit/psygnal/pull/293) ([tlambert03](https://github.com/tlambert03)) - feat: add collect\_fields option to SignalGroupDescriptor, and accept a SignalGroup subclass [\#291](https://github.com/pyapp-kit/psygnal/pull/291) ([getzze](https://github.com/getzze)) **Fixed bugs:** - A bit more consistent SignalGroup iter [\#289](https://github.com/pyapp-kit/psygnal/pull/289) ([getzze](https://github.com/getzze)) **Merged pull requests:** - ci\(dependabot\): bump softprops/action-gh-release from 1 to 2 [\#295](https://github.com/pyapp-kit/psygnal/pull/295) ([dependabot[bot]](https://github.com/apps/dependabot)) - chore: patch asv config to work locally with arm64 macos on hatchling [\#294](https://github.com/pyapp-kit/psygnal/pull/294) ([tlambert03](https://github.com/tlambert03)) ## [v0.10.0](https://github.com/pyapp-kit/psygnal/tree/v0.10.0) (2024-03-05) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.9.5...v0.10.0) **Implemented enhancements:** - feat: add priority to connect, to control callback order [\#285](https://github.com/pyapp-kit/psygnal/pull/285) ([tlambert03](https://github.com/tlambert03)) - feat: support for evented containers as pydantic v2 fields [\#283](https://github.com/pyapp-kit/psygnal/pull/283) ([tlambert03](https://github.com/tlambert03)) - perf: Fixing performance of evented set [\#275](https://github.com/pyapp-kit/psygnal/pull/275) ([Czaki](https://github.com/Czaki)) - refactor!: New SignalGroup that does not subclass SignalInstance [\#269](https://github.com/pyapp-kit/psygnal/pull/269) ([tlambert03](https://github.com/tlambert03)) - feat: emit the old value as second argument in Signals from SignalGroupDescriptor \(evented dataclass\) [\#257](https://github.com/pyapp-kit/psygnal/pull/257) ([getzze](https://github.com/getzze)) **Fixed bugs:** - fix: ensure proper order of signal emission [\#281](https://github.com/pyapp-kit/psygnal/pull/281) ([Czaki](https://github.com/Czaki)) - feat: deduplicate events emission in nested properties [\#279](https://github.com/pyapp-kit/psygnal/pull/279) ([Czaki](https://github.com/Czaki)) - fix: fix connect\_setattr on dataclass field signals [\#258](https://github.com/pyapp-kit/psygnal/pull/258) ([tlambert03](https://github.com/tlambert03)) - fix: add and fix copy operators 
[\#255](https://github.com/pyapp-kit/psygnal/pull/255) ([Czaki](https://github.com/Czaki)) - fix: fix 3.7 build [\#250](https://github.com/pyapp-kit/psygnal/pull/250) ([tlambert03](https://github.com/tlambert03)) **Tests & CI:** - test: test for recursion error [\#284](https://github.com/pyapp-kit/psygnal/pull/284) ([tlambert03](https://github.com/tlambert03)) - ci: inherit secrets in reusable workflow [\#266](https://github.com/pyapp-kit/psygnal/pull/266) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - chore: un-deprecate SignalGroup.signals [\#288](https://github.com/pyapp-kit/psygnal/pull/288) ([tlambert03](https://github.com/tlambert03)) - chore: use ruff format instead of black [\#287](https://github.com/pyapp-kit/psygnal/pull/287) ([tlambert03](https://github.com/tlambert03)) - refactor: Add back SignalGroup methods [\#286](https://github.com/pyapp-kit/psygnal/pull/286) ([tlambert03](https://github.com/tlambert03)) - chore: remove asynchronous emit and other deprecations [\#282](https://github.com/pyapp-kit/psygnal/pull/282) ([tlambert03](https://github.com/tlambert03)) - refactor: Unify pydantic evented model modules [\#280](https://github.com/pyapp-kit/psygnal/pull/280) ([tlambert03](https://github.com/tlambert03)) - perf: Do not use reducer if there is no callback in `SignalInstance.resume` [\#278](https://github.com/pyapp-kit/psygnal/pull/278) ([Czaki](https://github.com/Czaki)) - perf: Delay SignalRelay connection to when a callback is connected [\#277](https://github.com/pyapp-kit/psygnal/pull/277) ([tlambert03](https://github.com/tlambert03)) - build: remove all dependencies [\#273](https://github.com/pyapp-kit/psygnal/pull/273) ([tlambert03](https://github.com/tlambert03)) - docs: Update README.md with evented containers [\#272](https://github.com/pyapp-kit/psygnal/pull/272) ([tlambert03](https://github.com/tlambert03)) - docs: Update README.md with `make build` [\#270](https://github.com/pyapp-kit/psygnal/pull/270) ([tlambert03](https://github.com/tlambert03)) - build: Drop python 3.7 [\#268](https://github.com/pyapp-kit/psygnal/pull/268) ([tlambert03](https://github.com/tlambert03)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.16.4 to 2.16.5 [\#263](https://github.com/pyapp-kit/psygnal/pull/263) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.16.2 to 2.16.4 [\#256](https://github.com/pyapp-kit/psygnal/pull/256) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(dependabot\): bump actions/cache from 3 to 4 [\#253](https://github.com/pyapp-kit/psygnal/pull/253) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(dependabot\): bump actions/upload-artifact from 3 to 4 [\#249](https://github.com/pyapp-kit/psygnal/pull/249) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(dependabot\): bump actions/setup-python from 4 to 5 [\#248](https://github.com/pyapp-kit/psygnal/pull/248) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(dependabot\): bump CodSpeedHQ/action from 1 to 2 [\#246](https://github.com/pyapp-kit/psygnal/pull/246) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(dependabot\): bump conda-incubator/setup-miniconda from 2 to 3 [\#245](https://github.com/pyapp-kit/psygnal/pull/245) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci: use reusable ci workflow [\#241](https://github.com/pyapp-kit/psygnal/pull/241) ([tlambert03](https://github.com/tlambert03)) ## 
[v0.9.5](https://github.com/pyapp-kit/psygnal/tree/v0.9.5) (2023-11-13) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.9.4...v0.9.5) **Implemented enhancements:** - feat: better repr for WeakCallback objects [\#236](https://github.com/pyapp-kit/psygnal/pull/236) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - fix: fix py37 build [\#243](https://github.com/pyapp-kit/psygnal/pull/243) ([tlambert03](https://github.com/tlambert03)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.16.1 to 2.16.2 [\#240](https://github.com/pyapp-kit/psygnal/pull/240) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.15.0 to 2.16.1 [\#238](https://github.com/pyapp-kit/psygnal/pull/238) ([dependabot[bot]](https://github.com/apps/dependabot)) - refactor: make EmitLoop error message clearer [\#232](https://github.com/pyapp-kit/psygnal/pull/232) ([tlambert03](https://github.com/tlambert03)) ## [v0.9.4](https://github.com/pyapp-kit/psygnal/tree/v0.9.4) (2023-09-19) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.9.3...v0.9.4) **Implemented enhancements:** - perf: don't compare before/after values in evented dataclass/model when no signals connected [\#235](https://github.com/pyapp-kit/psygnal/pull/235) ([tlambert03](https://github.com/tlambert03)) **Fixed bugs:** - fix: emission of events from root validators and extraneous emission of dependent fields [\#234](https://github.com/pyapp-kit/psygnal/pull/234) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - ci\(dependabot\): bump actions/checkout from 3 to 4 [\#231](https://github.com/pyapp-kit/psygnal/pull/231) ([dependabot[bot]](https://github.com/apps/dependabot)) - test: python 3.12 [\#225](https://github.com/pyapp-kit/psygnal/pull/225) ([tlambert03](https://github.com/tlambert03)) ## [v0.9.3](https://github.com/pyapp-kit/psygnal/tree/v0.9.3) (2023-08-15) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.9.2...v0.9.3) **Fixed bugs:** - fix: fix signature inspection on debounced/throttled, update typing and wrapped [\#228](https://github.com/pyapp-kit/psygnal/pull/228) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - build: restrict py versions on cibuildwheel [\#229](https://github.com/pyapp-kit/psygnal/pull/229) ([tlambert03](https://github.com/tlambert03)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.14.1 to 2.15.0 [\#227](https://github.com/pyapp-kit/psygnal/pull/227) ([dependabot[bot]](https://github.com/apps/dependabot)) ## [v0.9.2](https://github.com/pyapp-kit/psygnal/tree/v0.9.2) (2023-08-12) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.9.1...v0.9.2) **Fixed bugs:** - fix: add deepcopy method for mypyc support, don't copy weakly connected slots [\#222](https://github.com/pyapp-kit/psygnal/pull/222) ([tlambert03](https://github.com/tlambert03)) - Fix imports of typing extensions [\#221](https://github.com/pyapp-kit/psygnal/pull/221) ([tlambert03](https://github.com/tlambert03)) **Tests & CI:** - ci: fix linux wheels [\#226](https://github.com/pyapp-kit/psygnal/pull/226) ([tlambert03](https://github.com/tlambert03)) - ci: change concurrency [\#224](https://github.com/pyapp-kit/psygnal/pull/224) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - build: remove setuppy [\#223](https://github.com/pyapp-kit/psygnal/pull/223) ([tlambert03](https://github.com/tlambert03)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.13.1 to 2.14.1 
[\#218](https://github.com/pyapp-kit/psygnal/pull/218) ([dependabot[bot]](https://github.com/apps/dependabot)) - fix: fix duplicated derived events [\#216](https://github.com/pyapp-kit/psygnal/pull/216) ([tlambert03](https://github.com/tlambert03)) - feat: support pydantic v2 [\#214](https://github.com/pyapp-kit/psygnal/pull/214) ([tlambert03](https://github.com/tlambert03)) - ci\(pre-commit.ci\): autoupdate [\#213](https://github.com/pyapp-kit/psygnal/pull/213) ([pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.13.0 to 2.13.1 [\#212](https://github.com/pyapp-kit/psygnal/pull/212) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(pre-commit.ci\): autoupdate [\#211](https://github.com/pyapp-kit/psygnal/pull/211) ([pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci)) ## [v0.9.1](https://github.com/pyapp-kit/psygnal/tree/v0.9.1) (2023-05-29) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.9.0...v0.9.1) **Implemented enhancements:** - feat: Support toolz [\#210](https://github.com/pyapp-kit/psygnal/pull/210) ([tlambert03](https://github.com/tlambert03)) **Fixed bugs:** - fix: better error message with keyword only partials [\#209](https://github.com/pyapp-kit/psygnal/pull/209) ([tlambert03](https://github.com/tlambert03)) **Tests & CI:** - build: add test dep [\#206](https://github.com/pyapp-kit/psygnal/pull/206) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - ci\(dependabot\): bump pypa/cibuildwheel from 2.12.3 to 2.13.0 [\#207](https://github.com/pyapp-kit/psygnal/pull/207) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(pre-commit.ci\): autoupdate [\#205](https://github.com/pyapp-kit/psygnal/pull/205) ([pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.12.1 to 2.12.3 [\#204](https://github.com/pyapp-kit/psygnal/pull/204) ([dependabot[bot]](https://github.com/apps/dependabot)) ## [v0.9.0](https://github.com/pyapp-kit/psygnal/tree/v0.9.0) (2023-04-07) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.8.1...v0.9.0) **Implemented enhancements:** - feat: add thread parameter to connection method, allowed "queued connections" [\#200](https://github.com/pyapp-kit/psygnal/pull/200) ([tlambert03](https://github.com/tlambert03)) - build: add pyinstaller hook to simplify frozing apps using pyinstaller [\#194](https://github.com/pyapp-kit/psygnal/pull/194) ([Czaki](https://github.com/Czaki)) **Merged pull requests:** - docs: add docs on connecting across thread [\#203](https://github.com/pyapp-kit/psygnal/pull/203) ([tlambert03](https://github.com/tlambert03)) - chore: deprecate async keyword in emit method [\#201](https://github.com/pyapp-kit/psygnal/pull/201) ([tlambert03](https://github.com/tlambert03)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.12.0 to 2.12.1 [\#197](https://github.com/pyapp-kit/psygnal/pull/197) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(dependabot\): bump actions/setup-python from 3 to 4 [\#193](https://github.com/pyapp-kit/psygnal/pull/193) ([dependabot[bot]](https://github.com/apps/dependabot)) ## [v0.8.1](https://github.com/pyapp-kit/psygnal/tree/v0.8.1) (2023-02-23) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.8.0...v0.8.1) **Fixed bugs:** - fix: fix strict signal group checking when signatures aren't hashable [\#192](https://github.com/pyapp-kit/psygnal/pull/192) ([tlambert03](https://github.com/tlambert03)) **Tests & 
CI:** - test: add back typesafety tests [\#190](https://github.com/pyapp-kit/psygnal/pull/190) ([tlambert03](https://github.com/tlambert03)) ## [v0.8.0](https://github.com/pyapp-kit/psygnal/tree/v0.8.0) (2023-02-23) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.7.2...v0.8.0) **Implemented enhancements:** - feat: compile throttler module, improve typing [\#187](https://github.com/pyapp-kit/psygnal/pull/187) ([tlambert03](https://github.com/tlambert03)) - feat: improved `monitor_events` [\#181](https://github.com/pyapp-kit/psygnal/pull/181) ([tlambert03](https://github.com/tlambert03)) - feat: make SignalGroupDescriptor public [\#173](https://github.com/pyapp-kit/psygnal/pull/173) ([tlambert03](https://github.com/tlambert03)) **Fixed bugs:** - fix: fix inheritance of classes with a SignalGroupDescriptor [\#186](https://github.com/pyapp-kit/psygnal/pull/186) ([tlambert03](https://github.com/tlambert03)) - fix: minor typing fixes on `connect` [\#180](https://github.com/pyapp-kit/psygnal/pull/180) ([tlambert03](https://github.com/tlambert03)) - fix: add getattr to signalgroup for typing [\#174](https://github.com/pyapp-kit/psygnal/pull/174) ([tlambert03](https://github.com/tlambert03)) **Tests & CI:** - ci: add dataclasses benchmarks [\#189](https://github.com/pyapp-kit/psygnal/pull/189) ([tlambert03](https://github.com/tlambert03)) - test: no cover compile funcs [\#185](https://github.com/pyapp-kit/psygnal/pull/185) ([tlambert03](https://github.com/tlambert03)) - ci: add evented benchmark [\#175](https://github.com/pyapp-kit/psygnal/pull/175) ([tlambert03](https://github.com/tlambert03)) - ci: add codspeed benchmarks [\#170](https://github.com/pyapp-kit/psygnal/pull/170) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - refactor: change patching of \_\_setattr\_\_ in SignalGroupDescriptor, make more explicit [\#188](https://github.com/pyapp-kit/psygnal/pull/188) ([tlambert03](https://github.com/tlambert03)) - docs: small docs updates, document EmissionLoopError [\#184](https://github.com/pyapp-kit/psygnal/pull/184) ([tlambert03](https://github.com/tlambert03)) - refactor: remove PSYGNAL\_UNCOMPILED flag. [\#183](https://github.com/pyapp-kit/psygnal/pull/183) ([tlambert03](https://github.com/tlambert03)) - docs: adding spellchecking to docs [\#182](https://github.com/pyapp-kit/psygnal/pull/182) ([tlambert03](https://github.com/tlambert03)) - docs: update evented docs to descript SignalGroupDescriptor [\#179](https://github.com/pyapp-kit/psygnal/pull/179) ([tlambert03](https://github.com/tlambert03)) - refactor: split out SlotCaller logic into new `weak_callable` module... 
maybe public eventually [\#178](https://github.com/pyapp-kit/psygnal/pull/178) ([tlambert03](https://github.com/tlambert03)) - refactor: split out dataclass utils [\#176](https://github.com/pyapp-kit/psygnal/pull/176) ([tlambert03](https://github.com/tlambert03)) - refactor: use weakmethod instead of \_get\_method\_name [\#168](https://github.com/pyapp-kit/psygnal/pull/168) ([tlambert03](https://github.com/tlambert03)) ## [v0.7.2](https://github.com/pyapp-kit/psygnal/tree/v0.7.2) (2023-02-11) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.7.1...v0.7.2) **Fixed bugs:** - fix: use weakref when instance is passed to SignalGroup [\#167](https://github.com/pyapp-kit/psygnal/pull/167) ([tlambert03](https://github.com/tlambert03)) ## [v0.7.1](https://github.com/pyapp-kit/psygnal/tree/v0.7.1) (2023-02-11) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.7.0...v0.7.1) **Implemented enhancements:** - feat: add `is_evented` and `get_evented_namespace` [\#166](https://github.com/pyapp-kit/psygnal/pull/166) ([tlambert03](https://github.com/tlambert03)) - feat: add support for msgspec Struct classes to evented decorator [\#165](https://github.com/pyapp-kit/psygnal/pull/165) ([tlambert03](https://github.com/tlambert03)) **Fixed bugs:** - fix: fix clobbering of SignalGroup name in EventedModel [\#158](https://github.com/pyapp-kit/psygnal/pull/158) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - ci\(dependabot\): bump pypa/cibuildwheel from 2.11.4 to 2.12.0 [\#164](https://github.com/pyapp-kit/psygnal/pull/164) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.11.3 to 2.11.4 [\#159](https://github.com/pyapp-kit/psygnal/pull/159) ([dependabot[bot]](https://github.com/apps/dependabot)) ## [v0.7.0](https://github.com/pyapp-kit/psygnal/tree/v0.7.0) (2022-12-20) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.6.1...v0.7.0) **Implemented enhancements:** - build: use mypyc instead of cython, move to hatch [\#149](https://github.com/pyapp-kit/psygnal/pull/149) ([tlambert03](https://github.com/tlambert03)) **Fixed bugs:** - fix: add dataclass\_transform to maintain IDE typing support for EventedModel.\_\_init\_\_ [\#154](https://github.com/pyapp-kit/psygnal/pull/154) ([tlambert03](https://github.com/tlambert03)) - Don't unblock/resume within nested contexts [\#150](https://github.com/pyapp-kit/psygnal/pull/150) ([hanjinliu](https://github.com/hanjinliu)) **Merged pull requests:** - ci\(pre-commit.ci\): autoupdate [\#155](https://github.com/pyapp-kit/psygnal/pull/155) ([tlambert03](https://github.com/tlambert03)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.11.2 to 2.11.3 [\#153](https://github.com/pyapp-kit/psygnal/pull/153) ([dependabot[bot]](https://github.com/apps/dependabot)) - style: use ruff instead of flake8, isort, pyupgrade, autoflake, etc... 
[\#146](https://github.com/pyapp-kit/psygnal/pull/146) ([tlambert03](https://github.com/tlambert03)) - chore: add deps to setup.py [\#145](https://github.com/pyapp-kit/psygnal/pull/145) ([tlambert03](https://github.com/tlambert03)) - refactor: remove PartialMethodMeta for TypeGuard func [\#144](https://github.com/pyapp-kit/psygnal/pull/144) ([tlambert03](https://github.com/tlambert03)) - refactor: don't use metaclass for signal group [\#143](https://github.com/pyapp-kit/psygnal/pull/143) ([tlambert03](https://github.com/tlambert03)) ## [v0.6.1](https://github.com/pyapp-kit/psygnal/tree/v0.6.1) (2022-11-13) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.6.0.post0...v0.6.1) **Fixed bugs:** - fix: fix failed weakref in connect\_setattr [\#142](https://github.com/pyapp-kit/psygnal/pull/142) ([tlambert03](https://github.com/tlambert03)) - fix: fix disconnection of partials [\#134](https://github.com/pyapp-kit/psygnal/pull/134) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - chore: rename org to pyapp-kit [\#141](https://github.com/pyapp-kit/psygnal/pull/141) ([tlambert03](https://github.com/tlambert03)) ## [v0.6.0.post0](https://github.com/pyapp-kit/psygnal/tree/v0.6.0.post0) (2022-11-09) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.6.0...v0.6.0.post0) **Merged pull requests:** - build: unskip cibuildwheel py311 [\#140](https://github.com/pyapp-kit/psygnal/pull/140) ([tlambert03](https://github.com/tlambert03)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.11.1 to 2.11.2 [\#138](https://github.com/pyapp-kit/psygnal/pull/138) ([dependabot[bot]](https://github.com/apps/dependabot)) ## [v0.6.0](https://github.com/pyapp-kit/psygnal/tree/v0.6.0) (2022-10-29) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.5.0...v0.6.0) **Implemented enhancements:** - build: drop py3.7 add py3.11 [\#135](https://github.com/pyapp-kit/psygnal/pull/135) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - chore: changelog v0.6.0 [\#137](https://github.com/pyapp-kit/psygnal/pull/137) ([tlambert03](https://github.com/tlambert03)) - build: support 3.7 again [\#136](https://github.com/pyapp-kit/psygnal/pull/136) ([tlambert03](https://github.com/tlambert03)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.10.2 to 2.11.1 [\#133](https://github.com/pyapp-kit/psygnal/pull/133) ([dependabot[bot]](https://github.com/apps/dependabot)) ## [v0.5.0](https://github.com/pyapp-kit/psygnal/tree/v0.5.0) (2022-10-14) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.4.2...v0.5.0) **Implemented enhancements:** - feat: add warning for poor usage [\#132](https://github.com/pyapp-kit/psygnal/pull/132) ([tlambert03](https://github.com/tlambert03)) - feat: add `@evented` decorator, turn any dataclass, attrs model, or pydantic model into evented [\#129](https://github.com/pyapp-kit/psygnal/pull/129) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - docs: update readme [\#131](https://github.com/pyapp-kit/psygnal/pull/131) ([tlambert03](https://github.com/tlambert03)) - docs: documentation for evented decorator [\#130](https://github.com/pyapp-kit/psygnal/pull/130) ([tlambert03](https://github.com/tlambert03)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.10.1 to 2.10.2 [\#127](https://github.com/pyapp-kit/psygnal/pull/127) ([dependabot[bot]](https://github.com/apps/dependabot)) ## [v0.4.2](https://github.com/pyapp-kit/psygnal/tree/v0.4.2) (2022-09-25) [Full 
Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.4.1...v0.4.2) **Fixed bugs:** - fix: fix inheritance of property setters [\#126](https://github.com/pyapp-kit/psygnal/pull/126) ([tlambert03](https://github.com/tlambert03)) - fix: fix bug in setattr with private attrs [\#125](https://github.com/pyapp-kit/psygnal/pull/125) ([tlambert03](https://github.com/tlambert03)) ## [v0.4.1](https://github.com/pyapp-kit/psygnal/tree/v0.4.1) (2022-09-22) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.4.0...v0.4.1) **Implemented enhancements:** - feat: Add ability to disconnect slots from Signal group directly [\#118](https://github.com/pyapp-kit/psygnal/pull/118) ([alisterburt](https://github.com/alisterburt)) **Fixed bugs:** - fix: fix listevents docstring parameter mismatch [\#119](https://github.com/pyapp-kit/psygnal/pull/119) ([alisterburt](https://github.com/alisterburt)) **Tests & CI:** - ci: skip building py311 wheel [\#124](https://github.com/pyapp-kit/psygnal/pull/124) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - ci\(dependabot\): bump pypa/cibuildwheel from 2.9.0 to 2.10.1 [\#123](https://github.com/pyapp-kit/psygnal/pull/123) ([dependabot[bot]](https://github.com/apps/dependabot)) - ci\(dependabot\): bump pypa/cibuildwheel from 2.8.1 to 2.9.0 [\#121](https://github.com/pyapp-kit/psygnal/pull/121) ([dependabot[bot]](https://github.com/apps/dependabot)) - build: pin cython [\#120](https://github.com/pyapp-kit/psygnal/pull/120) ([tlambert03](https://github.com/tlambert03)) - ci\(dependabot\): bump pypa/gh-action-pypi-publish from 1.5.0 to 1.5.1 [\#116](https://github.com/pyapp-kit/psygnal/pull/116) ([dependabot[bot]](https://github.com/apps/dependabot)) ## [v0.4.0](https://github.com/pyapp-kit/psygnal/tree/v0.4.0) (2022-07-26) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.3.5...v0.4.0) **Implemented enhancements:** - feat: raise exceptions as EmitLoopError [\#115](https://github.com/pyapp-kit/psygnal/pull/115) ([tlambert03](https://github.com/tlambert03)) - feat: add connect\_setitem [\#108](https://github.com/pyapp-kit/psygnal/pull/108) ([tlambert03](https://github.com/tlambert03)) - build: move entirely to pyproject, and src setup [\#101](https://github.com/pyapp-kit/psygnal/pull/101) ([tlambert03](https://github.com/tlambert03)) - add readthedocs config, make EventedCallableObjectProxy public [\#86](https://github.com/pyapp-kit/psygnal/pull/86) ([tlambert03](https://github.com/tlambert03)) **Fixed bugs:** - refactor: guard paramspec import [\#112](https://github.com/pyapp-kit/psygnal/pull/112) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - replace docs/requirements with extra, fix rtd install [\#87](https://github.com/pyapp-kit/psygnal/pull/87) ([tlambert03](https://github.com/tlambert03)) ## [v0.3.5](https://github.com/pyapp-kit/psygnal/tree/v0.3.5) (2022-05-25) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.3.4...v0.3.5) **Merged pull requests:** - \[pre-commit.ci\] pre-commit autoupdate [\#85](https://github.com/pyapp-kit/psygnal/pull/85) ([pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci)) - Add documentation [\#84](https://github.com/pyapp-kit/psygnal/pull/84) ([tlambert03](https://github.com/tlambert03)) - Evented pydantic model [\#83](https://github.com/pyapp-kit/psygnal/pull/83) ([tlambert03](https://github.com/tlambert03)) - \[pre-commit.ci\] pre-commit autoupdate [\#82](https://github.com/pyapp-kit/psygnal/pull/82) 
([pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci)) ## [v0.3.4](https://github.com/pyapp-kit/psygnal/tree/v0.3.4) (2022-05-02) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.3.3...v0.3.4) **Implemented enhancements:** - Add `EventedDict` [\#79](https://github.com/pyapp-kit/psygnal/pull/79) ([alisterburt](https://github.com/alisterburt)) - add `SelectableEventedList` [\#78](https://github.com/pyapp-kit/psygnal/pull/78) ([alisterburt](https://github.com/alisterburt)) - Add Throttler class [\#75](https://github.com/pyapp-kit/psygnal/pull/75) ([tlambert03](https://github.com/tlambert03)) - Add Selection model ported from napari [\#64](https://github.com/pyapp-kit/psygnal/pull/64) ([alisterburt](https://github.com/alisterburt)) **Fixed bugs:** - Make SignalInstance weak referenceable \(Fix forwarding signals\) [\#71](https://github.com/pyapp-kit/psygnal/pull/71) ([tlambert03](https://github.com/tlambert03)) ## [v0.3.3](https://github.com/pyapp-kit/psygnal/tree/v0.3.3) (2022-02-14) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.3.2...v0.3.3) **Fixed bugs:** - Used custom tuple for cython compatibility [\#69](https://github.com/pyapp-kit/psygnal/pull/69) ([tlambert03](https://github.com/tlambert03)) ## [v0.3.2](https://github.com/pyapp-kit/psygnal/tree/v0.3.2) (2022-02-14) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.3.1...v0.3.2) **Implemented enhancements:** - work with older cython [\#67](https://github.com/pyapp-kit/psygnal/pull/67) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - generate gh release in CI [\#68](https://github.com/pyapp-kit/psygnal/pull/68) ([tlambert03](https://github.com/tlambert03)) ## [v0.3.1](https://github.com/pyapp-kit/psygnal/tree/v0.3.1) (2022-02-12) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.3.0...v0.3.1) **Fixed bugs:** - Don't use `repr(obj)` when checking for Qt emit signature [\#66](https://github.com/pyapp-kit/psygnal/pull/66) ([tlambert03](https://github.com/tlambert03)) **Tests & CI:** - add a magicgui test to CI [\#65](https://github.com/pyapp-kit/psygnal/pull/65) ([tlambert03](https://github.com/tlambert03)) - skip cibuildwheel tests on musllinux and i686 [\#63](https://github.com/pyapp-kit/psygnal/pull/63) ([tlambert03](https://github.com/tlambert03)) ## [v0.3.0](https://github.com/pyapp-kit/psygnal/tree/v0.3.0) (2022-02-10) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.2.0...v0.3.0) **Implemented enhancements:** - Add EventedObjectProxy [\#62](https://github.com/pyapp-kit/psygnal/pull/62) ([tlambert03](https://github.com/tlambert03)) - Misc small changes, add iter\_signal\_instances to utils [\#61](https://github.com/pyapp-kit/psygnal/pull/61) ([tlambert03](https://github.com/tlambert03)) - Add EventedSet and EventedOrderedSet [\#59](https://github.com/pyapp-kit/psygnal/pull/59) ([tlambert03](https://github.com/tlambert03)) - add SignalGroup blocked context manager, improve inheritance, and fix strong refs [\#57](https://github.com/pyapp-kit/psygnal/pull/57) ([tlambert03](https://github.com/tlambert03)) - Add evented list \(more evented containers coming\) [\#56](https://github.com/pyapp-kit/psygnal/pull/56) ([tlambert03](https://github.com/tlambert03)) - add debug\_events util \(later changed to `monitor_events`\) [\#55](https://github.com/pyapp-kit/psygnal/pull/55) ([tlambert03](https://github.com/tlambert03)) - support Qt SignalInstance Emit [\#49](https://github.com/pyapp-kit/psygnal/pull/49) 
([tlambert03](https://github.com/tlambert03)) - Add SignalGroup [\#42](https://github.com/pyapp-kit/psygnal/pull/42) ([tlambert03](https://github.com/tlambert03)) **Tests & CI:** - add typesafety tests to evented containers [\#60](https://github.com/pyapp-kit/psygnal/pull/60) ([tlambert03](https://github.com/tlambert03)) - deal with changing API in benchmarks [\#43](https://github.com/pyapp-kit/psygnal/pull/43) ([tlambert03](https://github.com/tlambert03)) ## [v0.2.0](https://github.com/pyapp-kit/psygnal/tree/v0.2.0) (2021-11-07) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.1.4...v0.2.0) **Implemented enhancements:** - Add `connect/disconnect_settattr` [\#39](https://github.com/pyapp-kit/psygnal/pull/39) ([tlambert03](https://github.com/tlambert03)) - Enable uncompiled import with PSYGNAL\_UNCOMPILED env var [\#33](https://github.com/pyapp-kit/psygnal/pull/33) ([tlambert03](https://github.com/tlambert03)) - Add asv benchmark to CI [\#31](https://github.com/pyapp-kit/psygnal/pull/31) ([tlambert03](https://github.com/tlambert03)) - Avoid holding strong reference to decorated and partial methods [\#29](https://github.com/pyapp-kit/psygnal/pull/29) ([Czaki](https://github.com/Czaki)) - Change confusing variable name in \_acceptable\_posarg\_range [\#25](https://github.com/pyapp-kit/psygnal/pull/25) ([tlambert03](https://github.com/tlambert03)) **Fixed bugs:** - Set SignalInstances directly as attributes on objects \(fix bug with hashable signal holders\) [\#28](https://github.com/pyapp-kit/psygnal/pull/28) ([tlambert03](https://github.com/tlambert03)) **Tests & CI:** - Add benchmarks for connect\_setattr [\#41](https://github.com/pyapp-kit/psygnal/pull/41) ([Czaki](https://github.com/Czaki)) - Extend emit benchmarks to include methods [\#40](https://github.com/pyapp-kit/psygnal/pull/40) ([tlambert03](https://github.com/tlambert03)) - Fix codecov CI and bring coverage back to 100 [\#34](https://github.com/pyapp-kit/psygnal/pull/34) ([tlambert03](https://github.com/tlambert03)) - Change benchmark publication approach [\#32](https://github.com/pyapp-kit/psygnal/pull/32) ([tlambert03](https://github.com/tlambert03)) **Merged pull requests:** - Misc-typing and minor reorg [\#35](https://github.com/pyapp-kit/psygnal/pull/35) ([tlambert03](https://github.com/tlambert03)) ## [v0.1.4](https://github.com/pyapp-kit/psygnal/tree/v0.1.4) (2021-10-17) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.1.3...v0.1.4) **Implemented enhancements:** - support python 3.10 [\#24](https://github.com/pyapp-kit/psygnal/pull/24) ([tlambert03](https://github.com/tlambert03)) - Add ability to pause & resume/reduce signals [\#23](https://github.com/pyapp-kit/psygnal/pull/23) ([tlambert03](https://github.com/tlambert03)) ## [v0.1.3](https://github.com/pyapp-kit/psygnal/tree/v0.1.3) (2021-10-01) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.1.2...v0.1.3) **Implemented enhancements:** - add \_\_call\_\_ as alias for `emit` on SignalInstance [\#18](https://github.com/pyapp-kit/psygnal/pull/18) ([tlambert03](https://github.com/tlambert03)) ## [v0.1.2](https://github.com/pyapp-kit/psygnal/tree/v0.1.2) (2021-07-12) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.1.1...v0.1.2) **Implemented enhancements:** - Provide signatures for common builtins [\#7](https://github.com/pyapp-kit/psygnal/pull/7) ([tlambert03](https://github.com/tlambert03)) **Tests & CI:** - Add more typing tests [\#9](https://github.com/pyapp-kit/psygnal/pull/9) 
([tlambert03](https://github.com/tlambert03)) - test working with qtbot [\#8](https://github.com/pyapp-kit/psygnal/pull/8) ([tlambert03](https://github.com/tlambert03)) ## [v0.1.1](https://github.com/pyapp-kit/psygnal/tree/v0.1.1) (2021-07-07) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/v0.1.0...v0.1.1) **Implemented enhancements:** - connect decorator, optional args [\#5](https://github.com/pyapp-kit/psygnal/pull/5) ([tlambert03](https://github.com/tlambert03)) **Fixed bugs:** - Catch inspection failures on connect \(e.g. `print`\), and improve maxargs syntax [\#6](https://github.com/pyapp-kit/psygnal/pull/6) ([tlambert03](https://github.com/tlambert03)) ## [v0.1.0](https://github.com/pyapp-kit/psygnal/tree/v0.1.0) (2021-07-06) [Full Changelog](https://github.com/pyapp-kit/psygnal/compare/bd037d2cb3cdc1c9423fd7d88ac6edfdd40f39d9...v0.1.0) **Implemented enhancements:** - Add readme, add `@connect` decorator [\#3](https://github.com/pyapp-kit/psygnal/pull/3) ([tlambert03](https://github.com/tlambert03)) **Tests & CI:** - fix ci [\#2](https://github.com/pyapp-kit/psygnal/pull/2) ([tlambert03](https://github.com/tlambert03)) - ci [\#1](https://github.com/pyapp-kit/psygnal/pull/1) ([tlambert03](https://github.com/tlambert03)) \* *This Changelog was automatically generated by [github_changelog_generator](https://github.com/github-changelog-generator/github-changelog-generator)* psygnal-0.15.0/src/psygnal/__init__.py0000644000000000000000000000363215073705675014600 0ustar00"""Psygnal implements the observer pattern for Python. It emulates the signal/slot pattern from Qt, but it does not require Qt. """ import os from importlib.metadata import PackageNotFoundError, version from typing import TYPE_CHECKING, Any if TYPE_CHECKING: from ._evented_model import EventedModel try: __version__ = version("psygnal") except PackageNotFoundError: # pragma: no cover __version__ = "0.0.0" __author__ = "Talley Lambert" __email__ = "talley.lambert@gmail.com" __all__ = [ "EmissionInfo", "EmitLoopError", "EventedModel", "PathStep", "Signal", "SignalGroup", "SignalGroupDescriptor", "SignalInstance", "__version__", "_compiled", "debounced", "emit_queued", "evented", "get_async_backend", "get_evented_namespace", "is_evented", "set_async_backend", "throttled", ] if os.getenv("PSYGNAL_UNCOMPILED"): import warnings warnings.warn( "PSYGNAL_UNCOMPILED no longer has any effect. 
If you wish to run psygnal " "without compiled files, you can run:\n\n" 'python -c "import psygnal.utils; psygnal.utils.decompile()"\n\n' "(You will need to reinstall psygnal to get the compiled version back.)", stacklevel=2, ) from ._async import get_async_backend, set_async_backend from ._evented_decorator import evented from ._exceptions import EmitLoopError from ._group import EmissionInfo, PathStep, SignalGroup from ._group_descriptor import SignalGroupDescriptor, get_evented_namespace, is_evented from ._queue import emit_queued from ._signal import Signal, SignalInstance, _compiled from ._throttler import debounced, throttled def __getattr__(name: str) -> Any: # pragma: no cover if name == "EventedModel": from ._evented_model import EventedModel return EventedModel raise AttributeError(f"module {__name__!r} has no attribute {name!r}") del os, TYPE_CHECKING psygnal-0.15.0/src/psygnal/_async.py0000644000000000000000000001701315073705675014313 0ustar00from __future__ import annotations from abc import ABC, abstractmethod from math import inf from typing import TYPE_CHECKING, overload if TYPE_CHECKING: from collections.abc import Coroutine from typing import Any, Literal, Protocol, TypeAlias import anyio.streams.memory import trio from psygnal._weak_callback import WeakCallback SupportedBackend: TypeAlias = Literal["asyncio", "anyio", "trio"] QueueItem: TypeAlias = tuple["WeakCallback", tuple[Any, ...]] class EventLike(Protocol): def is_set(self) -> bool: """Return ``True`` if the flag is set, ``False`` if not.""" ... async def wait(self) -> Coroutine | bool | None: """Wait until the flag is set.""" ... _ASYNC_BACKEND: _AsyncBackend | None = None def get_async_backend() -> _AsyncBackend | None: """Get the current async backend. Returns None if no backend is set.""" return _ASYNC_BACKEND def clear_async_backend() -> None: """Clear the current async backend. Primarily for testing purposes.""" global _ASYNC_BACKEND if _ASYNC_BACKEND is not None: # Cancel any running tasks if it's asyncio and loop is not closed if isinstance(_ASYNC_BACKEND, AsyncioBackend): _ASYNC_BACKEND.close() # Close anyio streams elif isinstance(_ASYNC_BACKEND, AnyioBackend): _ASYNC_BACKEND.close() # Close trio channels elif isinstance(_ASYNC_BACKEND, TrioBackend): if hasattr(_ASYNC_BACKEND, "_send_channel"): _ASYNC_BACKEND._send_channel.close() # Note: trio receive channels don't have a close method _ASYNC_BACKEND = None @overload def set_async_backend(backend: Literal["asyncio"]) -> AsyncioBackend: ... @overload def set_async_backend(backend: Literal["anyio"]) -> AnyioBackend: ... @overload def set_async_backend(backend: Literal["trio"]) -> TrioBackend: ... def set_async_backend(backend: SupportedBackend = "asyncio") -> _AsyncBackend: """Set the async backend to use. Must be one of: 'asyncio', 'anyio', 'trio'. This should be done as early as possible, and *must* be called before calling `SignalInstance.connect` with a coroutine function. """ global _ASYNC_BACKEND if _ASYNC_BACKEND and _ASYNC_BACKEND._backend != backend: # pragma: no cover # allow setting the same backend multiple times, for tests raise RuntimeError(f"Async backend already set to: {_ASYNC_BACKEND._backend}") if backend == "asyncio": _ASYNC_BACKEND = AsyncioBackend() elif backend == "anyio": _ASYNC_BACKEND = AnyioBackend() elif backend == "trio": _ASYNC_BACKEND = TrioBackend() else: # pragma: no cover raise RuntimeError( f"Async backend not supported: {backend}. 
" "Must be one of: 'asyncio', 'anyio', 'trio'" ) return _ASYNC_BACKEND class _AsyncBackend(ABC): def __init__(self, backend: str): self._backend = backend @property @abstractmethod def running(self) -> EventLike: ... @abstractmethod def put(self, item: QueueItem) -> None: ... @abstractmethod async def run(self) -> None: ... async def call_back(self, item: QueueItem) -> None: cb, args = item if func := cb.dereference(): await func(*args) class AsyncioBackend(_AsyncBackend): def __init__(self) -> None: super().__init__("asyncio") import asyncio self._asyncio = asyncio self._queue: asyncio.Queue[tuple] = asyncio.Queue() self._task = asyncio.create_task(self.run()) self._loop = asyncio.get_running_loop() self._running = asyncio.Event() @property def running(self) -> EventLike: """Return the event indicating if the backend is running.""" return self._running def put(self, item: QueueItem) -> None: self._queue.put_nowait(item) def close(self) -> None: """Close the asyncio backend and cancel tasks.""" if hasattr(self, "_task") and not self._task.done(): self._task.cancel() async def run(self) -> None: if self._running.is_set(): return self._running.set() try: while True: item = await self._queue.get() try: await self.call_back(item) except Exception: # Log the exception but continue running # This prevents one bad callback from crashing the backend import traceback traceback.print_exc() except self._asyncio.CancelledError: pass except RuntimeError as e: # pragma: no cover if not self._loop.is_closed(): raise e finally: self._running.clear() class AnyioBackend(_AsyncBackend): _send_stream: anyio.streams.memory.MemoryObjectSendStream[QueueItem] _receive_stream: anyio.streams.memory.MemoryObjectReceiveStream[QueueItem] def __init__(self) -> None: super().__init__("anyio") import anyio self._anyio = anyio self._send_stream, self._receive_stream = anyio.create_memory_object_stream( max_buffer_size=inf ) self._running = anyio.Event() @property def running(self) -> EventLike: """Return the event indicating if the backend is running.""" return self._running def put(self, item: QueueItem) -> None: self._send_stream.send_nowait(item) def close(self) -> None: """Close the anyio streams.""" if hasattr(self, "_send_stream"): self._send_stream.close() if hasattr(self, "_receive_stream"): self._receive_stream.close() async def run(self) -> None: if self._running.is_set(): return # pragma: no cover self._running.set() try: async with self._receive_stream: async for item in self._receive_stream: try: await self.call_back(item) except Exception: # Log the exception but continue running import traceback traceback.print_exc() finally: self._running = self._anyio.Event() # Ensure streams are closed self.close() class TrioBackend(_AsyncBackend): _send_channel: trio._channel.MemorySendChannel[QueueItem] _receive_channel: trio.abc.ReceiveChannel[QueueItem] def __init__(self) -> None: super().__init__("trio") import trio self._trio = trio self._send_channel, self._receive_channel = trio.open_memory_channel( max_buffer_size=inf ) self._running = self._trio.Event() @property def running(self) -> EventLike: """Return the event indicating if the backend is running.""" return self._running def put(self, item: tuple) -> None: self._send_channel.send_nowait(item) async def run(self) -> None: if self._running.is_set(): return # pragma: no cover self._running.set() try: async for item in self._receive_channel: try: await self.call_back(item) except Exception: # Log the exception but continue running import traceback 
traceback.print_exc() finally: self._running = self._trio.Event() psygnal-0.15.0/src/psygnal/_dataclass_utils.py0000644000000000000000000001243015073705675016353 0ustar00from __future__ import annotations import contextlib import dataclasses import sys from types import GenericAlias from typing import TYPE_CHECKING, Any, Protocol, cast, overload if TYPE_CHECKING: from collections.abc import Iterator from typing import TypeGuard import attrs import msgspec from pydantic import BaseModel class _DataclassParams(Protocol): init: bool repr: bool eq: bool order: bool unsafe_hash: bool frozen: bool class AttrsType: __attrs_attrs__: tuple[attrs.Attribute, ...] _DATACLASS_PARAMS = "__dataclass_params__" with contextlib.suppress(ImportError): from dataclasses import _DATACLASS_PARAMS # type: ignore _DATACLASS_FIELDS = "__dataclass_fields__" with contextlib.suppress(ImportError): from dataclasses import _DATACLASS_FIELDS # type: ignore class DataClassType: __dataclass_params__: _DataclassParams __dataclass_fields__: dict[str, dataclasses.Field] @overload def is_dataclass(obj: type) -> TypeGuard[type[DataClassType]]: ... @overload def is_dataclass(obj: object) -> TypeGuard[DataClassType]: ... def is_dataclass(obj: object) -> TypeGuard[DataClassType]: """Return True if the object is a dataclass.""" cls = ( obj if isinstance(obj, type) and not isinstance(obj, GenericAlias) else type(obj) ) return hasattr(cls, _DATACLASS_FIELDS) @overload def is_attrs_class(obj: type) -> TypeGuard[type[AttrsType]]: ... @overload def is_attrs_class(obj: object) -> TypeGuard[AttrsType]: ... def is_attrs_class(obj: object) -> TypeGuard[type[AttrsType]]: """Return True if the class is an attrs class.""" attr = sys.modules.get("attr", None) cls = obj if isinstance(obj, type) else type(obj) return attr.has(cls) if attr is not None else False @overload def is_pydantic_model(obj: type) -> TypeGuard[type[BaseModel]]: ... @overload def is_pydantic_model(obj: object) -> TypeGuard[BaseModel]: ... def is_pydantic_model(obj: object) -> TypeGuard[BaseModel]: """Return True if the class is a pydantic BaseModel.""" pydantic = sys.modules.get("pydantic", None) cls = obj if isinstance(obj, type) else type(obj) return pydantic is not None and issubclass(cls, pydantic.BaseModel) @overload def is_msgspec_struct(obj: type) -> TypeGuard[type[msgspec.Struct]]: ... @overload def is_msgspec_struct(obj: object) -> TypeGuard[msgspec.Struct]: ... 
def is_msgspec_struct(obj: object) -> TypeGuard[msgspec.Struct]: """Return True if the class is a `msgspec.Struct`.""" msgspec = sys.modules.get("msgspec", None) cls = obj if isinstance(obj, type) else type(obj) return msgspec is not None and issubclass(cls, msgspec.Struct) def is_frozen(obj: Any) -> bool: """Return True if the object is frozen.""" # sourcery skip: reintroduce-else cls = obj if isinstance(obj, type) else type(obj) params = cast("_DataclassParams | None", getattr(cls, _DATACLASS_PARAMS, None)) if params is not None: return params.frozen # pydantic cfg = getattr(cls, "__config__", None) if cfg is not None and getattr(cfg, "allow_mutation", None) is False: return True # pydantic v2 cfg = getattr(cls, "model_config", None) if cfg is not None and cfg.get("frozen"): return True # attrs if getattr(cls.__setattr__, "__name__", None) == "_frozen_setattrs": return True cfg = getattr(cls, "__struct_config__", None) if cfg is not None: # pragma: no cover # this will be covered in msgspec > 0.13.1 return bool(getattr(cfg, "frozen", False)) return False def iter_fields( cls: type, exclude_frozen: bool = True ) -> Iterator[tuple[str, type | None]]: """Iterate over all fields in the class, including inherited fields. This function recognizes dataclasses, attrs classes, msgspec Structs, and pydantic models. Parameters ---------- cls : type The class to iterate over. exclude_frozen : bool, optional If True, frozen fields will be excluded. By default True. Yields ------ tuple[str, type | None] The name and type of each field. """ # generally opting for speed here over public API if (dclass_fields := getattr(cls, "__dataclass_fields__", None)) is not None: for d_field in dclass_fields.values(): if d_field._field_type is dataclasses._FIELD: # type: ignore [attr-defined] yield d_field.name, d_field.type return if is_pydantic_model(cls): if hasattr(cls, "model_fields"): for field_name, p_field in cls.model_fields.items(): if not p_field.frozen or not exclude_frozen: yield field_name, p_field.annotation else: for p_field in cls.__fields__.values(): # type: ignore [attr-defined] if p_field.field_info.allow_mutation or not exclude_frozen: yield p_field.name, p_field.outer_type_ return if (attrs_fields := getattr(cls, "__attrs_attrs__", None)) is not None: for a_field in attrs_fields: yield a_field.name, a_field.type return if is_msgspec_struct(cls): for m_field in cls.__struct_fields__: type_ = cls.__annotations__.get(m_field, None) yield m_field, type_ return psygnal-0.15.0/src/psygnal/_evented_decorator.py0000644000000000000000000001340615073705675016674 0ustar00from __future__ import annotations from typing import TYPE_CHECKING, Literal, TypeVar, overload from psygnal._group_descriptor import SignalGroupDescriptor if TYPE_CHECKING: from collections.abc import Callable, Mapping from psygnal._group_descriptor import EqOperator, FieldAliasFunc __all__ = ["evented"] T = TypeVar("T", bound=type) @overload def evented( cls: T, *, events_namespace: str = "events", equality_operators: dict[str, EqOperator] | None = None, warn_on_no_fields: bool = ..., cache_on_instance: bool = ..., connect_child_events: bool = ..., signal_aliases: Mapping[str, str | None] | FieldAliasFunc | None = ..., ) -> T: ... 
@overload def evented( cls: Literal[None] | None = None, *, events_namespace: str = "events", equality_operators: dict[str, EqOperator] | None = None, warn_on_no_fields: bool = ..., cache_on_instance: bool = ..., connect_child_events: bool = ..., signal_aliases: Mapping[str, str | None] | FieldAliasFunc | None = ..., ) -> Callable[[T], T]: ... def evented( cls: T | None = None, *, events_namespace: str = "events", equality_operators: dict[str, EqOperator] | None = None, warn_on_no_fields: bool = True, cache_on_instance: bool = True, connect_child_events: bool = True, signal_aliases: Mapping[str, str | None] | FieldAliasFunc | None = None, ) -> Callable[[T], T] | T: """A decorator to add events to a dataclass. See also the documentation for [`SignalGroupDescriptor`][psygnal.SignalGroupDescriptor]. This decorator is equivalent setting a class variable named `events` to a new `SignalGroupDescriptor` instance. Note that this decorator will modify `cls` *in place*, as well as return it. !!!tip It is recommended to use the `SignalGroupDescriptor` descriptor rather than the decorator, as it it is more explicit and provides for easier static type inference. Parameters ---------- cls : type The class to decorate. events_namespace : str The name of the namespace to add the events to, by default `"events"` equality_operators : dict[str, Callable] | None A dictionary mapping field names to equality operators (a function that takes two values and returns `True` if they are equal). These will be used to determine if a field has changed when setting a new value. By default, this will use the `__eq__` method of the field type, or np.array_equal, for numpy arrays. But you can provide your own if you want to customize how equality is checked. Alternatively, if the class has an `__eq_operators__` class attribute, it will be used. warn_on_no_fields : bool If `True` (the default), a warning will be emitted if no mutable dataclass-like fields are found on the object. cache_on_instance : bool, optional If `True` (the default), a newly-created SignalGroup instance will be cached on the instance itself, so that subsequent accesses to the descriptor will return the same SignalGroup instance. This makes for slightly faster subsequent access, but means that the owner instance will no longer be pickleable. If `False`, the SignalGroup instance will *still* be cached, but not on the instance itself. connect_child_events : bool, optional If `True`, will connect events from all fields on the dataclass whose type is also "evented" (as determined by the `psygnal.is_evented` function, which returns True if the class has been decorated with `@evented`, or if it has a SignalGroupDescriptor) to the group on the parent object. By default True. This is useful for nested evented dataclasses, where you want to monitor events emitted from arbitrarily deep children on the parent object. signal_aliases: Mapping[str, str | None] | Callable[[str], str | None] | None If defined, a mapping between field name and signal name. Field names that are not `signal_aliases` keys are not aliased (the signal name is the field name). If the dict value is None, do not create a signal associated with this field. If a callable, the signal name is the output of the function applied to the field name. If the output is None, no signal is created for this field. If None, defaults to an empty dict, no aliases. 
Default to None Returns ------- type The decorated class, which gains a new SignalGroup instance at the `events_namespace` attribute (by default, `events`). Raises ------ TypeError If the class is frozen or is not a class. Examples -------- ```python from psygnal import evented from dataclasses import dataclass @evented @dataclass class Person: name: str age: int = 0 ``` """ def _decorate(cls: T) -> T: if not isinstance(cls, type): # pragma: no cover raise TypeError("evented can only be used on classes") if any(k.startswith("_psygnal") for k in getattr(cls, "__annotations__", {})): raise TypeError("Fields on an evented class cannot start with '_psygnal'") descriptor: SignalGroupDescriptor = SignalGroupDescriptor( equality_operators=equality_operators, warn_on_no_fields=warn_on_no_fields, cache_on_instance=cache_on_instance, connect_child_events=connect_child_events, signal_aliases=signal_aliases, ) # as a decorator, this will have already been called descriptor.__set_name__(cls, events_namespace) setattr(cls, events_namespace, descriptor) return cls return _decorate(cls) if cls is not None else _decorate psygnal-0.15.0/src/psygnal/_evented_model.py0000644000000000000000000007620115073705675016014 0ustar00import sys import warnings from collections.abc import Callable, Iterator, Mapping from contextlib import contextmanager, suppress from typing import ( TYPE_CHECKING, Any, ClassVar, NamedTuple, Union, cast, no_type_check, ) import pydantic from pydantic import PrivateAttr from ._group import SignalGroup from ._group_descriptor import _check_field_equality, _pick_equality_operator from ._signal import ReemissionMode, Signal PYDANTIC_V1 = pydantic.version.VERSION.startswith("1") if TYPE_CHECKING: from inspect import Signature from typing import TypeGuard from pydantic import ConfigDict from pydantic._internal import _model_construction as pydantic_main from pydantic._internal import _utils as utils from pydantic._internal._decorators import PydanticDescriptorProxy from typing_extensions import dataclass_transform as dataclass_transform # py311 from ._signal import SignalInstance EqOperator = Callable[[Any, Any], bool] else: if PYDANTIC_V1: import pydantic.main as pydantic_main from pydantic import utils else: from pydantic._internal import _model_construction as pydantic_main from pydantic._internal import _utils as utils try: # py311 from typing_extensions import dataclass_transform except ImportError: # pragma: no cover def dataclass_transform(*args, **kwargs): return lambda a: a NULL = object() ALLOW_PROPERTY_SETTERS = "allow_property_setters" FIELD_DEPENDENCIES = "field_dependencies" GUESS_PROPERTY_DEPENDENCIES = "guess_property_dependencies" REEMISSION = "reemission" @contextmanager def no_class_attributes() -> Iterator[None]: # pragma: no cover """Context in which pydantic_main.ClassAttribute just passes value 2. Due to a very annoying decision by PySide2, all class ``__signature__`` attributes may only be assigned **once**. (This seems to be regardless of whether the class has anything to do with PySide2 or not). Furthermore, the PySide2 ``__signature__`` attribute seems to break the python descriptor protocol, which means that class attributes that have a ``__get__`` method will not be able to successfully retrieve their value (instead, the descriptor object itself will be accessed). 
This plays terribly with Pydantic, which assigns a ``ClassAttribute`` object to the value of ``cls.__signature__`` in ``ModelMetaclass.__new__`` in order to avoid masking the call signature of object instances that have a ``__call__`` method (https://github.com/samuelcolvin/pydantic/pull/1466). So, because we only get to set the ``__signature__`` once, this context manager basically "opts-out" of pydantic's ``ClassAttribute`` strategy, thereby directly setting the ``cls.__signature__`` to an instance of ``inspect.Signature``. For additional context, see: - https://github.com/napari/napari/issues/2264 - https://github.com/napari/napari/pull/2265 - https://bugreports.qt.io/browse/PYSIDE-1004 - https://codereview.qt-project.org/c/pyside/pyside-setup/+/261411 """ if "PySide2" not in sys.modules: yield return # monkey patch the pydantic ClassAttribute object # the second argument to ClassAttribute is the inspect.Signature object def _return2(x: str, y: "Signature") -> "Signature": return y pydantic_main.ClassAttribute = _return2 # type: ignore try: yield finally: # undo our monkey patch pydantic_main.ClassAttribute = utils.ClassAttribute # type: ignore if not PYDANTIC_V1: def _get_defaults( obj: pydantic.BaseModel | type[pydantic.BaseModel], ) -> dict[str, Any]: """Get possibly nested default values for a Model object.""" dflt = {} cls = obj if isinstance(obj, type) else type(obj) for k, v in cls.model_fields.items(): d = v.get_default() if ( d is None and isinstance(v.annotation, type) and issubclass(v.annotation, pydantic.BaseModel) ): d = _get_defaults(v.annotation) # pragma: no cover dflt[k] = d return dflt def _get_config(cls: pydantic.BaseModel) -> "ConfigDict": return cls.model_config def _get_fields( cls: type[pydantic.BaseModel], ) -> dict[str, pydantic.fields.FieldInfo]: comp_fields = { name: pydantic.fields.FieldInfo(annotation=f.return_type, frozen=False) for name, f in cls.model_computed_fields.items() } return {**cls.model_fields, **comp_fields} def _model_dump(obj: pydantic.BaseModel) -> dict: return obj.model_dump() def _is_pydantic_descriptor_proxy(obj: Any) -> "TypeGuard[PydanticDescriptorProxy]": if ( type(obj).__module__.startswith("pydantic") and type(obj).__name__ == "PydanticDescriptorProxy" and isinstance(getattr(obj, "wrapped", None), property) ): return True return False else: @no_type_check def _get_defaults(obj: pydantic.BaseModel) -> dict[str, Any]: """Get possibly nested default values for a Model object.""" dflt = {} for k, v in obj.__fields__.items(): d = v.get_default() if d is None and isinstance(v.type_, pydantic_main.ModelMetaclass): d = _get_defaults(v.type_) # pragma: no cover dflt[k] = d return dflt class GetAttrAsItem: def __init__(self, obj: Any) -> None: self._obj = obj def get(self, key: str, default: Any = None) -> Any: return getattr(self._obj, key, default) @no_type_check def _get_config(cls: type) -> "ConfigDict": return GetAttrAsItem(cls.__config__) class FieldInfo(NamedTuple): annotation: type[Any] | None frozen: bool | None @no_type_check def _get_fields(cls: type) -> dict[str, FieldInfo]: return { k: FieldInfo(annotation=f.type_, frozen=not f.field_info.allow_mutation) for k, f in cls.__fields__.items() } def _model_dump(obj: pydantic.BaseModel) -> dict: return obj.dict() def _is_pydantic_descriptor_proxy(obj: Any) -> "TypeGuard[PydanticDescriptorProxy]": return False class ComparisonDelayer: """Context that delays before/after comparisons until exit.""" def __init__(self, target: "EventedModel") -> None: self._target = target def 
__enter__(self) -> None: self._target._delay_check_semaphore += 1 def __exit__(self, *_: Any, **__: Any) -> None: self._target._delay_check_semaphore -= 1 self._target._check_if_values_changed_and_emit_if_needed() class EventedMetaclass(pydantic_main.ModelMetaclass): """pydantic ModelMetaclass that preps "equality checking" operations. A metaclass is the thing that "constructs" a class, and ``ModelMetaclass`` is where pydantic puts a lot of it's type introspection and ``ModelField`` creation logic. Here, we simply tack on one more function, that builds a ``cls.__eq_operators__`` dict which is mapping of field name to a function that can be called to check equality of the value of that field with some other object. (used in ``EventedModel.__eq__``) This happens only once, when an ``EventedModel`` class is created (and not when each instance of an ``EventedModel`` is instantiated). """ __property_setters__: dict[str, property] @no_type_check def __new__( mcs: type, name: str, bases: tuple, namespace: dict, **kwargs: Any ) -> "EventedMetaclass": """Create new EventedModel class.""" with no_class_attributes(): cls = super().__new__(mcs, name, bases, namespace, **kwargs) cls.__eq_operators__ = {} signals = {} model_fields = _get_fields(cls) model_config = _get_config(cls) emission_cfg = model_config.get(REEMISSION, {}) default_strategy: ReemissionMode = ReemissionMode.LATEST emission_map: Mapping[str, ReemissionMode] = {} if isinstance(emission_cfg, (str, ReemissionMode)): default_strategy = ReemissionMode.validate(emission_cfg) else: try: emission_map = { k: ReemissionMode.validate(v) for k, v in emission_cfg.items() } except (ValueError, TypeError) as e: valid = ", ".join(repr(x) for x in ReemissionMode._members()) raise ValueError( f"Invalid reemission value {emission_cfg!r}. Must be a mapping " f"of field names to one of {valid}." ) from e for n, f in model_fields.items(): cls.__eq_operators__[n] = _pick_equality_operator(f.annotation) if not f.frozen: recursion = emission_map.get(n, default_strategy) signals[n] = Signal(f.annotation, reemission=recursion) # If a field type has a _json_encode method, add it to the json # encoders for this model. # NOTE: a _json_encode field must return an object that can be # passed to json.dumps ... but it needn't return a string. 
if PYDANTIC_V1 and hasattr(f.annotation, "_json_encode"): encoder = f.annotation._json_encode cls.__config__.json_encoders[f.annotation] = encoder # also add it to the base config # required for pydantic>=1.8.0 due to: # https://github.com/samuelcolvin/pydantic/pull/2064 for base in cls.__bases__: if hasattr(base, "__config__"): base.__config__.json_encoders[f.annotation] = encoder allow_props = model_config.get(ALLOW_PROPERTY_SETTERS, False) # check for @_.setters defined on the class, so we can allow them # in EventedModel.__setattr__ cls.__property_setters__ = {} if allow_props: # inherit property setters from base classes for b in reversed(cls.__bases__): if hasattr(b, "__property_setters__"): cls.__property_setters__.update(b.__property_setters__) # add property setters from this class for key, attr in namespace.items(): if _is_pydantic_descriptor_proxy(attr): attr = attr.wrapped if isinstance(attr, property) and attr.fset is not None: cls.__property_setters__[key] = attr recursion = emission_map.get(key, default_strategy) signals[key] = Signal(object, reemission=recursion) else: for b in cls.__bases__: with suppress(AttributeError): conf = _get_config(b) if conf and conf.get(ALLOW_PROPERTY_SETTERS, False): raise ValueError( "Cannot set 'allow_property_setters' to 'False' when base " f"class {b} sets it to True" ) cls.__field_dependents__ = _get_field_dependents( cls, model_config, model_fields ) cls.__signal_group__ = type(f"{name}SignalGroup", (SignalGroup,), signals) if not cls.__field_dependents__ and hasattr(cls, "_setattr_no_dependants"): cls._setattr_default = cls._setattr_no_dependants elif hasattr(cls, "_setattr_with_dependents"): cls._setattr_default = cls._setattr_with_dependents return cls def _get_field_dependents( cls: "EventedMetaclass", model_config: dict, model_fields: dict ) -> dict[str, set[str]]: """Return mapping of field name -> dependent set of property names. Dependencies may be declared in the Model Config to emit an event for a computed property when a model field that it depends on changes e.g. (@property 'c' depends on model fields 'a' and 'b') Examples -------- class MyModel(EventedModel): a: int = 1 b: int = 1 @property def c(self) -> List[int]: return [self.a, self.b] @c.setter def c(self, val: Sequence[int]): self.a, self.b = val class Config: field_dependencies={'c': ['a', 'b']} """ deps: dict[str, set[str]] = {} cfg_deps = model_config.get(FIELD_DEPENDENCIES, {}) # sourcery skip if not cfg_deps: cfg_deps = model_config.get("property_dependencies", {}) if cfg_deps: warnings.warn( "The 'property_dependencies' configuration key is deprecated. " "Use 'field_dependencies' instead", DeprecationWarning, stacklevel=2, ) if cfg_deps: if not isinstance(cfg_deps, dict): # pragma: no cover raise TypeError( f"Config field_dependencies must be a dict, not {cfg_deps!r}" ) for prop, fields in cfg_deps.items(): if prop not in {*model_fields, *cls.__property_setters__}: raise ValueError( "Fields with dependencies must be fields or property.setters. " f"{prop!r} is not." ) for field in fields: if field not in model_fields and not hasattr(cls, field): warnings.warn( f"property {prop!r} cannot depend on unrecognized attribute " f"name: {field!r}", stacklevel=2, ) deps.setdefault(field, set()).add(prop) if model_config.get(GUESS_PROPERTY_DEPENDENCIES, False): # if field_dependencies haven't been explicitly defined, we can glean # them from the property.fget code object: # SKIP THIS MAGIC FOR NOW? 
for prop, setter in cls.__property_setters__.items(): if setter.fget is not None: for name in setter.fget.__code__.co_names: if name in model_fields: deps.setdefault(name, set()).add(prop) return deps @dataclass_transform(kw_only_default=True, field_specifiers=(pydantic.Field,)) class EventedModel(pydantic.BaseModel, metaclass=EventedMetaclass): """A pydantic BaseModel that emits a signal whenever a field value is changed. !!! important This class requires `pydantic` to be installed. You can install directly (`pip install pydantic`) or by using the psygnal extra: `pip install psygnal[pydantic]` In addition to standard pydantic `BaseModel` properties (see [pydantic docs](https://pydantic-docs.helpmanual.io/usage/models/)), this class adds the following: 1. Gains an `events` attribute that is an instance of [`psygnal.SignalGroup`][]. This group will have a signal for each field in the model (excluding private attributes and non-mutable fields). Whenever a field in the model is mutated, the corresponding signal will emit with the new value (see example below). 2. Gains support for properties and property.setters (not supported in pydantic's BaseModel). Enable by adding `allow_property_setters = True` to your model `Config`. 3. If you would like properties (i.e. "computed fields") to emit an event when one of the model fields it depends on is mutated you must set one of the following options in the `Config`: - `field_dependencies` may be a `Dict[str, List[str]]`, where the keys are the names of properties, and the values are a list of field names (strings) that the property depends on for its value - `guess_property_dependencies` may be set to `True` to "guess" property dependencies by inspecting the source code of the property getter for. 4. If you would like to allow custom fields to provide their own json_encoders, you can either: 1. use the [standard pydantic method](https://pydantic-docs.helpmanual.io/usage/exporting_models) of adding json_encoders to your model, for each field type you'd like to support: 1. This `EventedModel` class will additionally look for a `_json_encode` method on any field types in the model. If a field type declares a `_json_encode` method, it will be added to the [`json_encoders`](https://pydantic-docs.helpmanual.io/usage/exporting_models/#json_encoders) dict in the model `Config`. (Prefer using the standard pydantic method) Examples -------- Standard EventedModel example: ```python class MyModel(EventedModel): x: int = 1 m = MyModel() m.events.x.connect(lambda v: print(f"new value is {v}")) m.x = 3 # prints 'new value is 3' ``` An example of using property_setters and emitting signals when a field dependency is mutated. ```python class MyModel(EventedModel): a: int = 1 b: int = 1 @property def c(self) -> List[int]: return [self.a, self.b] @c.setter def c(self, val: Sequence[int]) -> None: self.a, self.b = val class Config: allow_property_setters = True field_dependencies = {"c": ["a", "b"]} m = MyModel() assert m.c == [1, 1] m.events.c.connect(lambda v: print(f"c updated to {v}")) m.a = 2 # prints 'c updated to [2, 1]' ``` """ # add private attributes for event emission _events: ClassVar[SignalGroup] = PrivateAttr() # mapping of name -> property obj for methods that are property setters __property_setters__: ClassVar[dict[str, property]] # mapping of field name -> dependent set of property names # when field is changed, an event for dependent properties will be emitted. 
__field_dependents__: ClassVar[dict[str, set[str]]] __eq_operators__: ClassVar[dict[str, "EqOperator"]] __slots__ = {"__weakref__"} __signal_group__: ClassVar[type[SignalGroup]] _changes_queue: dict[str, Any] = PrivateAttr(default_factory=dict) _primary_changes: set[str] = PrivateAttr(default_factory=set) _delay_check_semaphore: int = PrivateAttr(0) if PYDANTIC_V1: class Config: # this seems to be necessary for the _json_encoders trick to work json_encoders: ClassVar[dict] = {"____": None} def __init__(_model_self_, **data: Any) -> None: super().__init__(**data) Group = _model_self_.__signal_group__ # the type error is "cannot assign to a class variable" ... # but if we don't use `ClassVar`, then the `dataclass_transform` decorator # will add _events: SignalGroup to the __init__ signature, for *all* user models _model_self_._events = Group(_model_self_) # type: ignore [misc] # expose the private SignalGroup publicly @property def events(self) -> SignalGroup: """Return the `SignalGroup` containing all events for this model.""" return self._events @property def _defaults(self) -> dict[str, Any]: return _get_defaults(self) def __eq__(self, other: Any) -> bool: """Check equality with another object. We override the pydantic approach (which just checks ``self.model_dump() == other.model_dump()``) to accommodate more complicated types like arrays, whose truth value is often ambiguous. ``__eq_operators__`` is constructed in ``EqualityMetaclass.__new__`` """ if not isinstance(other, EventedModel): return bool(_model_dump(self) == other) for f_name, _ in self.__eq_operators__.items(): if not hasattr(self, f_name) or not hasattr(other, f_name): return False # pragma: no cover a = getattr(self, f_name) b = getattr(other, f_name) if not _check_field_equality(type(self), f_name, a, b): return False return True def update(self, values: Union["EventedModel", dict], recurse: bool = True) -> None: """Update a model in place. Parameters ---------- values : Union[dict, EventedModel] Values to update the model with. If an EventedModel is passed it is first converted to a dictionary. The keys of this dictionary must be found as attributes on the current model. recurse : bool If True, recursively update fields that are EventedModels. Otherwise, just update the immediate fields of this EventedModel, which is useful when the declared field type (e.g. ``Union``) can have different realized types with different fields. """ if isinstance(values, pydantic.BaseModel): values = _model_dump(values) if not isinstance(values, dict): # pragma: no cover raise TypeError(f"values must be a dict or BaseModel. got {type(values)}") with self.events._psygnal_relay.paused(): # TODO: reduce? for key, value in values.items(): field = getattr(self, key) if isinstance(field, EventedModel) and recurse: field.update(value, recurse=recurse) else: setattr(self, key, value) def reset(self) -> None: """Reset the state of the model to default values.""" model_config = _get_config(self) model_fields = _get_fields(type(self)) for name, value in self._defaults.items(): if isinstance(value, EventedModel): cast("EventedModel", getattr(self, name)).reset() elif not model_config.get("frozen") and not model_fields[name].frozen: setattr(self, name, value) def _check_if_values_changed_and_emit_if_needed(self) -> None: """Check if field values changed and emit events if needed. This method is called when exiting a ComparisonDelayer context. It processes all queued changes, compares old vs new values, and emits signals for fields that actually changed. 
The advantage of moving this to the end of all modifications is that comparisons are performed only once for every potential change. """ # Early exit if we're still delaying comparisons or have no changes to check if self._delay_check_semaphore > 0 or len(self._changes_queue) == 0: return # ----------------- Process primary changes ------------------- # "Primary changes" are fields that were directly assigned to (as opposed # to dependent properties that might have changed as a side effect). # `_primary_changes` get added in the `_setattr_with_dependents_impl` method to_emit: list[tuple[str, Any]] = [] # list of (field name, new value) primary_changes_occurred = False for name in self._primary_changes: old_value = self._changes_queue[name] new_value = getattr(self, name) if not _check_field_equality(type(self), name, new_value, old_value): # This field actually changed value if name in self._events: # Field has a signal, queue it for emission to_emit.append((name, new_value)) else: # Field doesn't have a signal but might have dependents # that need checking primary_changes_occurred |= name in self.__field_dependents__ # Remove from queue since we've processed this primary change self._changes_queue.pop(name) # -------------------------------------------------------------- # If no primary changes occurred and no signals need emitting, # we can skip checking dependents (optimization) if not to_emit and not primary_changes_occurred: self._changes_queue.clear() self._primary_changes.clear() return # ---------- Process dependent property changes ---------- # Any remaining items in the changes queue are now # dependent properties that were queued for checking. for name, old_value in self._changes_queue.items(): new_value = getattr(self, name) if not _check_field_equality(type(self), name, new_value, old_value): to_emit.append((name, new_value)) # Clean up tracking state self._changes_queue.clear() self._primary_changes.clear() # Emit all the signals that need emitting # Use ComparisonDelayer to prevent re-entrancy issues when callbacks # modify the model again if to_emit: with ComparisonDelayer(self): for name, new_value in to_emit: getattr(self._events, name)(new_value) def __setattr__(self, name: str, value: Any) -> None: # can happen on init if name == "_events" or not hasattr(self, "_events"): # fallback to default behavior for special fields and during init return self._super_setattr_(name, value) # Check if this is a property setter first - property setters should # always go through _super_setattr_ regardless of signals/dependencies if name in self.__property_setters__: return self._super_setattr_(name, value) # Check if this field needs special handling (has signal or dependencies) is_signal_field = name in self._events has_dependents = self.__field_dependents__ and name in self.__field_dependents__ if is_signal_field or has_dependents: # For signal fields with no dependents, use the faster path if available if ( is_signal_field and not has_dependents and hasattr(self, "_setattr_no_dependants") ): self._setattr_no_dependants(name, value) else: # Use the full setattr method for fields with dependents self._setattr_default(name, value) else: # Field doesn't have signals or dependents, use fast path self._super_setattr_(name, value) def _super_setattr_(self, name: str, value: Any) -> None: # pydantic will raise a ValueError if extra fields are not allowed # so we first check to see if this field has a property.setter. # if so, we use it instead. 
if name in self.__property_setters__: # Wrap property setter calls in ComparisonDelayer to batch field changes with ComparisonDelayer(self): self.__property_setters__[name].fset(self, value) # type: ignore[misc] elif name == "_events": # pydantic v2 prohibits shadowing class vars, on instances object.__setattr__(self, name, value) else: super().__setattr__(name, value) def _setattr_default(self, name: str, value: Any) -> None: """Will be overwritten by metaclass __new__. It will become either `_setattr_no_dependants` (if the class has neither properties nor `__field_dependents__`), or `_setattr_with_dependents` if it does. """ def _setattr_no_dependants(self, name: str, value: Any) -> None: """Simple __setattr__ behavior when the class has no properties.""" group = self._events signal_instance: SignalInstance = group[name] if len(signal_instance) < 1: return self._super_setattr_(name, value) old_value = getattr(self, name, object()) self._super_setattr_(name, value) if not _check_field_equality(type(self), name, value, old_value): getattr(self._events, name)(value) def _setattr_with_dependents(self, name: str, value: Any) -> None: """__setattr__ behavior when the class has properties.""" with ComparisonDelayer(self): self._setattr_with_dependents_impl(name, value) def _setattr_with_dependents_impl(self, name: str, value: Any) -> None: """The "real" __setattr__ implementation inside of the comparison delayer.""" # if there are no listeners, we can just set the value without emitting # so first check if there are any listeners for this field or any of its # dependent properties. # note that ALL signals will have sat least one listener simply by nature of # being in the `self._events` SignalGroup. signal_group = self._events if name in signal_group: signal_instance: SignalInstance = signal_group[name] deps_with_callbacks = { dep_name for dep_name in self.__field_dependents__.get(name, ()) if len(signal_group[dep_name]) } if ( len(signal_instance) < 1 # the signal itself has no listeners and not deps_with_callbacks # no dependent properties with listeners and not len(signal_group._psygnal_relay) # no listeners on the group ): return self._super_setattr_(name, value) elif name in self.__field_dependents__: deps_with_callbacks = self.__field_dependents__[name] else: return self._super_setattr_(name, value) # pragma: no cover self._primary_changes.add(name) if name not in self._changes_queue: self._changes_queue[name] = getattr(self, name, object()) for dep in deps_with_callbacks: if dep not in self._changes_queue: self._changes_queue[dep] = getattr(self, dep, object()) self._super_setattr_(name, value) if PYDANTIC_V1: @contextmanager def enums_as_values(self, as_values: bool = True) -> Iterator[None]: """Temporarily override how enums are retrieved. Parameters ---------- as_values : bool Whether enums should be shown as values (or as enum objects), by default `True` """ before = getattr(self.Config, "use_enum_values", NULL) self.Config.use_enum_values = as_values # type: ignore try: yield finally: if before is not NULL: self.Config.use_enum_values = before # type: ignore # pragma: no cover else: delattr(self.Config, "use_enum_values") else: @classmethod @contextmanager def enums_as_values( # type: ignore [misc] # Incompatible redefinition cls, as_values: bool = True ) -> Iterator[None]: # pragma: no cover """Temporarily override how enums are retrieved. 
Parameters ---------- as_values : bool Whether enums should be shown as values (or as enum objects), by default `True` """ before = cls.model_config.get("use_enum_values", NULL) cls.model_config["use_enum_values"] = as_values try: yield finally: if before is not NULL: # pragma: no cover cls.model_config["use_enum_values"] = cast("bool", before) else: cls.model_config.pop("use_enum_values") psygnal-0.15.0/src/psygnal/_exceptions.py0000644000000000000000000001067015073705675015361 0ustar00from __future__ import annotations import inspect import os from contextlib import suppress from pathlib import Path from typing import TYPE_CHECKING import psygnal if TYPE_CHECKING: from collections.abc import Container, Sequence from ._signal import SignalInstance ROOT = str(Path(psygnal.__file__).parent) class EmitLoopError(Exception): """Error type raised when an exception occurs during a callback.""" __module__ = "psygnal" def __init__( self, exc: BaseException, signal: SignalInstance | None = None, recursion_depth: int = 0, reemission: str | None = None, emit_queue: Sequence[tuple] = (), ) -> None: # if isinstance(exc, EmitLoopError): # super().__init__("nested EmitLoopError.") # return self.__cause__ = exc # grab the signal name or repr if signal is None: # pragma: no cover sig_name: str = "" elif instance := signal.instance: inst_class = instance.__class__ mod = getattr(inst_class, "__module__", "") if mod: mod += "." sig_name = f"{mod}{inst_class.__qualname__}.{signal.name}" else: sig_name = signal.name msg = _build_psygnal_exception_msg(sig_name, exc, recursion_depth) # queued emission can be confusing, because the `signal.emit()` call shown # in the traceback will not match the emission that actually raised the error. if reemission == "queued" and (depth := len(emit_queue) - 1): msg += ( "\nNOTE: reemission is set to 'queued', and this error occurred " f"at a queue-depth of {depth}.\nEmitting arguments: {emit_queue[-1]})\n" ) super().__init__(msg) def _build_psygnal_exception_msg( sig_name: str, exc: BaseException, recursion_depth: int ) -> str: msg = f"\n\nWhile emitting signal {sig_name!r}, an error occurred in a callback:" line = f"\n\n {exc.__class__.__name__}: {exc}" msg += line + "\n " + "-" * (len(line) - 4) if recursion_depth: s = "s" if recursion_depth > 1 else "" msg += f"\nnested {recursion_depth} level{s} deep" # get the first frame in the stack that is not in the psygnal package stack = inspect.stack() with suppress(Exception): emit_frame = next(fi for fi in stack if ROOT not in fi.filename) msg += "\n\n SIGNAL EMISSION: \n" with suppress(IndexError): back_frame = stack[stack.index(emit_frame) + 1] msg += f" {_fmt_frame(back_frame)}\n" msg += f" {_fmt_frame(emit_frame)} # <-- SIGNAL WAS EMITTED HERE\n" # get the last frame in the traceback, the one that raised the exception if tb := exc.__traceback__: with suppress(Exception): if not (inner := inspect.getinnerframes(tb)): return msg # pragma: no cover except_frame = inner[-1] # show the immediately connected callback first first_cb = next((fi for fi in inner if ROOT not in fi.filename), None) if first_cb and first_cb != except_frame: num_inner = len(inner) - inner.index(first_cb) - 2 msg += "\n CALLBACK CHAIN:\n" msg += f" {_fmt_frame(first_cb, with_context=False)}" msg += f" ... 
{num_inner} more frames ...\n" # Then end with the frame that raised the exception msg += f" {_fmt_frame(except_frame)} # <-- ERROR OCCURRED HERE \n" if flocals := except_frame.frame.f_locals: if not os.getenv("PSYGNAL_HIDE_LOCALS"): msg += "\n Local variables:\n" msg += _fmt_locals(flocals) return msg def _fmt_frame(fi: inspect.FrameInfo, with_context: bool = True) -> str: msg = f"{fi.filename}:{fi.lineno} in {fi.function}\n" if with_context and (code_ctx := fi.code_context): msg += f" {code_ctx[0].strip()}" return msg def _fmt_locals( f_locals: dict, exclude: Container[str] = ("self", "cls"), name_width: int = 20 ) -> str: lines = [] for name, value in f_locals.items(): if name not in exclude: val_repr = repr(value) if len(val_repr) > 60: val_repr = val_repr[:60] + "..." # pragma: no cover lines.append("{:>{}} = {}".format(name, name_width, val_repr)) return "\n".join(lines) psygnal-0.15.0/src/psygnal/_group.py0000644000000000000000000006571315073705675014344 0ustar00"""A SignalGroup class that allows connecting to all SignalInstances on the class. Note that unlike a slot/callback connected to SignalInstance.connect, a slot connected to SignalGroup.connect does *not* receive the direct arguments that were emitted by a given SignalInstance. Instead, the slot/callback will receive an EmissionInfo named tuple, which contains `.signal`: the SignalInstance doing the emitting, and `.args`: the args that were emitted. """ from __future__ import annotations import warnings from dataclasses import dataclass from types import MappingProxyType from typing import ( TYPE_CHECKING, Any, ClassVar, Literal, SupportsIndex, overload, ) from psygnal._signal import _NULL, Signal, SignalInstance, _SignalBlocker from ._mypyc import mypyc_attr if TYPE_CHECKING: import threading from collections.abc import ( Callable, Container, Hashable, Iterable, Iterator, Mapping, ) from contextlib import AbstractContextManager from psygnal._signal import F, ReducerFunc from psygnal._weak_callback import RefErrorChoice, WeakCallback __all__ = ["EmissionInfo", "SignalGroup"] @dataclass(slots=True, frozen=True) class PathStep: """A single step in a path to a nested signal. This is used to represent a path to a signal that is nested inside an object, such as when a signal is emitted from a nested object or a list. The `EmissionInfo.path` is a tuple of `PathStep` objects, where each `PathStep` represents either: - an _attribute_ access (e.g. `.foo`) - an _index_ access (e.g. `[3]`) - a _key_ access (e.g. `['user']`) !!! note _yes, `index` and `key` both just get passed to `__getitem__` in the end, but we differentiate them here to make it clearer whether the step is a key in a `Mapping` or an index in a `Sequence`._ """ attr: str | None = None index: SupportsIndex | None = None key: Hashable | None = None def __post_init__(self) -> None: # enforce mutual exclusivity if sum(x is not None for x in (self.attr, self.index, self.key)) != 1: raise ValueError( "PathStep must have exactly one of attr, index, or key set." ) # nice debug display: .foo, [3], ['user'] def __repr__(self) -> str: # pragma: no cover if self.attr is not None: return f".{self.attr}" if self.index is not None: return f"[{int(self.index)}]" key = repr(self.key) if len(key) > 20 and not isinstance(self.key, str): key = f"{key[:17]}..." return f"[{key}]" @dataclass(slots=True, frozen=True) class EmissionInfo: """Tuple containing information about an emission event. 
Attributes ---------- signal : SignalInstance The SignalInstance doing the emitting args: tuple The args that were emitted path: tuple[PathStep, ...] A tuple of PathStep objects that describe the path to the signal that was emitted. This is useful for nested signals, where the path can be used to determine the hierarchy of the signals. For example, if a signal is emitted from a nested object like `bar.foo[3]['user']`, the path will contain the steps to get to that object, such as (PathStep(attr='foo'), PathStep(index=3), PathStep(key='user')). """ signal: SignalInstance args: tuple[Any, ...] path: tuple[PathStep, ...] = () def insert_path(self, *path: PathStep) -> EmissionInfo: """Return a new EmissionInfo with the given path steps inserted at the front.""" return EmissionInfo(self.signal, self.args, (*path, *self.path)) def __post_init__(self) -> None: if self.path and not all( isinstance(p, (PathStep, str, int)) for p in self.path ): raise TypeError( "EmissionInfo.path must be a tuple of PathStep objects, " "or a tuple of str or int." ) def __iter__(self) -> Iterator[Any]: # pragma: no cover """Iterate over the EmissionInfo and all nested EmissionInfos.""" warnings.warn( "`EmissionInfo.__iter__` is no longer a NamedTuple and should not be " "iterated over. Use direct attribute access instead.", RuntimeWarning, stacklevel=2, ) yield self.signal yield self.args class SignalRelay(SignalInstance): """Special SignalInstance that can be used to connect to all signals in a group. This class will rarely be instantiated by a user (or anything other than a SignalGroup). Parameters ---------- signals : Mapping[str, SignalInstance] A mapping of signal names to SignalInstance instances. instance : Any, optional An object to which this `SignalRelay` is bound, by default None """ def __init__( self, signals: Mapping[str, SignalInstance], instance: Any = None ) -> None: super().__init__(signature=(EmissionInfo,), instance=instance) self._signals = MappingProxyType(signals) self._sig_was_blocked: dict[str, bool] = {} self._on_first_connect_callbacks: list[Callable[[], None]] = [] def _append_slot(self, slot: WeakCallback) -> None: super()._append_slot(slot) if len(self._slots) == 1: self._connect_relay() def _connect_relay(self) -> None: # Call any registered callbacks on first connection for callback in self._on_first_connect_callbacks: callback() # silence any warnings about failed weakrefs (will occur in compiled version) with warnings.catch_warnings(): warnings.simplefilter("ignore") for sig in self._signals.values(): sig.connect( self._slot_relay, check_nargs=False, check_types=False, unique=True ) def _remove_slot(self, slot: int | WeakCallback | Literal["all"]) -> None: super()._remove_slot(slot) if not self._slots: self._disconnect_relay() def _disconnect_relay(self) -> None: for sig in self._signals.values(): sig.disconnect(self._slot_relay) def _slot_relay(self, *args: Any, loc: PathStep | None = None) -> None: """Relay signals from child to parent. This is an important method for SignalGroups. It is the method that is responsible for relaying signals from all child signals within the group to any connected slots on the group itself. 
""" emitter = Signal.current_emitter() if emitter is None: return par_path = (loc,) if loc is not None else () # If child already gave us an EmissionInfo, use it; if args and isinstance((info := args[0]), EmissionInfo): child_info = info.insert_path(*par_path) # else wrap its args unchanged else: child_info = emitter._psygnal_relocate_info_( EmissionInfo(signal=emitter, args=args, path=par_path) ) self._run_emit_loop((child_info,)) def _relay_partial(self, loc: PathStep | None) -> _relay_partial: """Return special partial that will call _slot_relay with 'loc'.""" return _relay_partial(self, loc) def connect_direct( self, slot: Callable | None = None, *, check_nargs: bool | None = None, check_types: bool | None = None, unique: bool | Literal["raise"] = False, max_args: int | None = None, ) -> Callable[[Callable], Callable] | Callable: """Connect `slot` to be called whenever *any* Signal in this group is emitted. Params are the same as `psygnal.SignalInstance.connect`. It's probably best to check whether all signals are uniform (i.e. have the same signature). Parameters ---------- slot : Callable A callable to connect to this signal. If the callable accepts less arguments than the signature of this slot, then they will be discarded when calling the slot. check_nargs : bool | None If `True` and the provided `slot` requires more positional arguments than the signature of this Signal, raise `TypeError`. by default `True`. check_types : bool | None If `True`, An additional check will be performed to make sure that types declared in the slot signature are compatible with the signature declared by this signal, by default `False`. unique : bool | Literal["raise"] If `True`, returns without connecting if the slot has already been connected. If the literal string "raise" is passed to `unique`, then a `ValueError` will be raised if the slot is already connected. By default `False`. max_args : int, optional If provided, `slot` will be called with no more more than `max_args` when this SignalInstance is emitted. (regardless of how many arguments are emitted). Returns ------- Union[Callable[[Callable], Callable], Callable] [description] """ def _inner(slot: Callable) -> Callable: for sig in self._signals.values(): sig.connect( slot, check_nargs=check_nargs, check_types=check_types, unique=unique, max_args=max_args, ) return slot return _inner if slot is None else _inner(slot) def block(self, exclude: Container[str | SignalInstance] = ()) -> None: """Block this signal and all emitters from emitting.""" super().block() for name, sig in self._signals.items(): if name in exclude or sig in exclude: continue self._sig_was_blocked[name] = sig._is_blocked sig.block() def unblock(self) -> None: """Unblock this signal and all emitters, allowing them to emit.""" super().unblock() for name, sig in self._signals.items(): if not self._sig_was_blocked.pop(name, False): sig.unblock() def blocked( self, exclude: Container[str | SignalInstance] = () ) -> AbstractContextManager[None]: """Context manager to temporarily block all emitters in this group. Parameters ---------- exclude : iterable of str or SignalInstance, optional An iterable of signal instances or names to exempt from the block, by default () """ return _SignalBlocker(self, exclude=exclude) def disconnect(self, slot: Callable | None = None, missing_ok: bool = True) -> None: """Disconnect slot from all signals. Parameters ---------- slot : callable, optional The specific slot to disconnect. 
If `None`, all slots will be disconnected, by default `None` missing_ok : bool, optional If `False` and the provided `slot` is not connected, raises `ValueError. by default `True` Raises ------ ValueError If `slot` is not connected and `missing_ok` is False. """ for sig in self._signals.values(): sig.disconnect(slot, missing_ok) super().disconnect(slot, missing_ok) def _slot_index(self, slot: Callable) -> int: """Get index of `slot` in `self._slots`. Return -1 if not connected. In interpreted mode, `_relay_partial` callbacks are stored as weak references to the callable object itself, so comparing the dereferenced callback to the provided `_relay_partial` works. In compiled mode (mypyc), `weak_callback` may normalize a `_relay_partial` to a strong reference to its `__call__` method (a MethodWrapperType), to avoid segfaults on garbage collection. In that case the default WeakCallback equality logic is the correct and more robust path. Therefore, try the base implementation first (which compares normalized WeakCallback objects). If that fails and we're dealing with a `_relay_partial`, fall back to comparing the dereferenced callable to the provided slot for backward compatibility. """ # First, try the standard equality path used by SignalInstance idx = super()._slot_index(slot) if idx != -1: return idx # Fallback: handle direct comparison for `_relay_partial` instances if isinstance(slot, _relay_partial): with self._lock: for i, s in enumerate(self._slots): deref = s.dereference() # Case 1: stored deref is the _relay_partial itself (interpreted) if deref == slot: return i # Case 2: compiled path where we stored __call__ bound method # (these will never hit on codecov, but they are covered in tests) owner = getattr(deref, "__self__", None) # pragma: no cover if ( isinstance(owner, _relay_partial) and owner == slot ): # pragma: no cover return i return -1 # pragma: no cover # NOTE # To developers. Avoid adding public names to this class, as it is intended to be # a container for user-determined names. If names must be added, try to prefix # with "psygnal_" to avoid conflicts with user-defined names. @mypyc_attr(allow_interpreted_subclasses=True) class SignalGroup: """A collection of signals that can be connected to as a single unit. This class is not intended to be instantiated directly. Instead, it should be subclassed, and the subclass should define Signal instances as class attributes. The SignalGroup will then automatically collect these signals and provide a SignalRelay instance (at `group.all`) that can be used to connect to all of the signals in the group. This class is used in both the EventedModels and the evented dataclass patterns. See also: `psygnal.SignalGroupDescriptor`, which provides convenient and explicit way to create a SignalGroup on a dataclass-like class. Parameters ---------- instance : Any, optional An object to which this `SignalGroup` is bound, by default None Attributes ---------- all : SignalRelay A special SignalRelay instance that can be used to connect to all signals in this group. 
Examples -------- ```python from psygnal import Signal, SignalGroup class MySignals(SignalGroup): sig1 = Signal() sig2 = Signal() group = MySignals() group.all.connect(print) # connect to all signals in the group list(group) # ['sig1', 'sig2'] len(group) # 2 group.sig1 is group["sig1"] # True ``` """ _psygnal_signals: ClassVar[Mapping[str, Signal]] _psygnal_uniform: ClassVar[bool] = False _psygnal_name_conflicts: ClassVar[set[str]] _psygnal_aliases: ClassVar[dict[str, str | None]] _psygnal_instances: dict[str, SignalInstance] def __init__(self, instance: Any = None) -> None: cls = type(self) if not hasattr(cls, "_psygnal_signals"): raise TypeError( "Cannot instantiate `SignalGroup` directly. Use a subclass instead." ) self._psygnal_instances = { name: ( sig._create_signal_instance(self) if name in cls._psygnal_name_conflicts else sig.__get__(self, cls) ) for name, sig in cls._psygnal_signals.items() } self._psygnal_relay = SignalRelay(self._psygnal_instances, instance) def __init_subclass__( cls, strict: bool = False, signal_aliases: Mapping[str, str | None] = {}, ) -> None: """Collects all Signal instances on the class under `cls._psygnal_signals`.""" # Collect Signals and remove from class attributes # Use dir(cls) instead of cls.__dict__ to get attributes from super() forbidden = { k for k in getattr(cls, "__dict__", ()) if k.startswith("_psygnal") } if forbidden: raise TypeError( f"SignalGroup subclass cannot have attributes starting with '_psygnal'." f" Found: {forbidden}" ) _psygnal_signals = {} for k in dir(cls): val = getattr(cls, k, None) if isinstance(val, Signal): _psygnal_signals[k] = val # Collect the Signals also from super-class # When subclassing, the Signals have been removed from the attributes, # look for cls._psygnal_signals also cls._psygnal_signals = { **getattr(cls, "_psygnal_signals", {}), **_psygnal_signals, } # Emit warning for signal names conflicting with SignalGroup attributes reserved = set(dir(SignalGroup)) cls._psygnal_name_conflicts = conflicts = { k for k in cls._psygnal_signals if k in reserved or k.startswith(("_psygnal", "psygnal")) } if conflicts: for name in conflicts: if isinstance(getattr(cls, name), Signal): delattr(cls, name) Names = "Names" if len(conflicts) > 1 else "Name" Are = "are" if len(conflicts) > 1 else "is" warnings.warn( f"{Names} {sorted(conflicts)!r} {Are} reserved. You cannot use these " "names to access SignalInstances as attributes on a SignalGroup. (You " "may still access them as keys to __getitem__: `group['name']`).", UserWarning, stacklevel=2, ) aliases = getattr(cls, "_psygnal_aliases", {}) cls._psygnal_aliases = {**aliases, **signal_aliases} cls._psygnal_uniform = _is_uniform(cls._psygnal_signals.values()) if strict and not cls._psygnal_uniform: raise TypeError( "All Signals in a strict SignalGroup must have the same signature" ) super().__init_subclass__() @property def instance(self) -> Any: """Object that owns this `SignalGroup`.""" return self._psygnal_relay.instance @property def all(self) -> SignalRelay: """SignalInstance that can be used to connect to all signals in this group. Examples -------- ```python from psygnal import Signal, SignalGroup class MySignals(SignalGroup): sig1 = Signal() sig2 = Signal() group = MySignals() group.sig2.connect(...) # connect to a single signal by name group.all.connect(...) 
# connect to all signals in the group ``` """ return self._psygnal_relay @property def signals(self) -> Mapping[str, SignalInstance]: """A mapping of signal names to SignalInstance instances.""" # TODO: deprecate this property # warnings.warn( # "Accessing the `signals` property on a SignalGroup is deprecated. " # "Use __iter__ to iterate over all signal names, and __getitem__ or " # "getattr to access signal instances. This will be an error in a future.", # FutureWarning, # stacklevel=2, # ) return MappingProxyType(self._psygnal_instances) def __len__(self) -> int: """Return the number of signals in the group (not including the relay).""" return len(self._psygnal_instances) def __getitem__(self, item: str) -> SignalInstance: """Get a signal instance by name.""" return self._psygnal_instances[item] # this is just here for type checking, particularly on cases # where the SignalGroup comes from the SignalGroupDescriptor # (such as in evented dataclasses). In those cases, it's hard to indicate # to mypy that all remaining attributes are SignalInstances. def __getattr__(self, __name: str) -> SignalInstance: """Get a signal instance by name.""" raise AttributeError( # pragma: no cover f"{type(self).__name__!r} object has no attribute {__name!r}" ) def __iter__(self) -> Iterator[str]: """Yield the names of all signals in the group.""" return iter(self._psygnal_instances) def __contains__(self, item: str) -> bool: """Return True if the group contains a signal with the given name.""" # this is redundant with __iter__ and can be removed, but only after # removing the deprecation warning in __getattr__ return item in self._psygnal_instances def __repr__(self) -> str: """Return repr(self).""" name = self.__class__.__name__ if self.instance is not None: owner_type = type(self.instance).__name__ owner_repr = f" on <{owner_type} instance at {id(self.instance):#x}>" else: owner_repr = "" return f"" @classmethod def psygnals_uniform(cls) -> bool: """Return true if all signals in the group have the same signature.""" return cls._psygnal_uniform @classmethod def is_uniform(cls) -> bool: """Return true if all signals in the group have the same signature.""" warnings.warn( "The `is_uniform` method on SignalGroup is deprecated. Use " "`psygnals_uniform` instead. This will be an error in v0.11.", FutureWarning, stacklevel=2, ) return cls._psygnal_uniform def __deepcopy__(self, memo: dict[int, Any]) -> SignalGroup: # TODO: # This really isn't a deep copy. Should we also copy connections? # a working deepcopy is important for pydantic support, but in most cases # it will be a group without any signals connected return type(self)(instance=self._psygnal_relay.instance) # The rest are passthrough methods to the SignalRelay. # The full signatures are here to make mypy and IDEs happy. # parity with SignalInstance methods is tested in test_group.py @overload def connect( self, *, thread: threading.Thread | Literal["main", "current"] | None = ..., check_nargs: bool | None = ..., check_types: bool | None = ..., unique: bool | Literal["raise"] = ..., max_args: int | None = None, on_ref_error: RefErrorChoice = ..., priority: int = ..., ) -> Callable[[F], F]: ... @overload def connect( self, slot: F, *, thread: threading.Thread | Literal["main", "current"] | None = ..., check_nargs: bool | None = ..., check_types: bool | None = ..., unique: bool | Literal["raise"] = ..., max_args: int | None = None, on_ref_error: RefErrorChoice = ..., priority: int = ..., ) -> F: ... 
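Group-level connections go through the `SignalRelay`, so (unlike per-signal connections) the slot receives a single `EmissionInfo` rather than the emitting signal's own arguments. A minimal sketch; `EmissionInfo` is imported from the private `_group` module here only for the annotation:

```python
# Illustrative sketch of connecting at the group level.
from psygnal import Signal, SignalGroup
from psygnal._group import EmissionInfo


class MySignals(SignalGroup):
    sig1 = Signal(int)
    sig2 = Signal(str)


group = MySignals()


@group.connect
def _on_any(info: EmissionInfo) -> None:
    print(info.signal.name, info.args)


group.sig1.emit(42)       # prints: sig1 (42,)
group.sig2.emit("hello")  # prints: sig2 ('hello',)
```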
def connect( self, slot: F | None = None, *, thread: threading.Thread | Literal["main", "current"] | None = None, check_nargs: bool | None = None, check_types: bool | None = None, unique: bool | Literal["raise"] = False, max_args: int | None = None, on_ref_error: RefErrorChoice = "warn", priority: int = 0, emit_on_evented_child_events: bool = False, ) -> Callable[[F], F] | F: if slot is None: return self._psygnal_relay.connect( thread=thread, check_nargs=check_nargs, check_types=check_types, unique=unique, max_args=max_args, on_ref_error=on_ref_error, priority=priority, emit_on_evented_child_events=emit_on_evented_child_events, ) else: return self._psygnal_relay.connect( slot, thread=thread, check_nargs=check_nargs, check_types=check_types, unique=unique, max_args=max_args, on_ref_error=on_ref_error, priority=priority, emit_on_evented_child_events=emit_on_evented_child_events, ) def connect_direct( self, slot: Callable | None = None, *, check_nargs: bool | None = None, check_types: bool | None = None, unique: bool | Literal["raise"] = False, max_args: int | None = None, ) -> Callable[[Callable], Callable] | Callable: return self._psygnal_relay.connect_direct( slot, check_nargs=check_nargs, check_types=check_types, unique=unique, max_args=max_args, ) def disconnect(self, slot: Callable | None = None, missing_ok: bool = True) -> None: return self._psygnal_relay.disconnect(slot=slot, missing_ok=missing_ok) def block(self, exclude: Container[str | SignalInstance] = ()) -> None: return self._psygnal_relay.block(exclude=exclude) def unblock(self) -> None: return self._psygnal_relay.unblock() def blocked( self, exclude: Container[str | SignalInstance] = () ) -> AbstractContextManager[None]: return self._psygnal_relay.blocked(exclude=exclude) def pause(self) -> None: return self._psygnal_relay.pause() def resume(self, reducer: ReducerFunc | None = None, initial: Any = _NULL) -> None: return self._psygnal_relay.resume(reducer=reducer, initial=initial) def paused( self, reducer: ReducerFunc | None = None, initial: Any = _NULL ) -> AbstractContextManager[None]: return self._psygnal_relay.paused(reducer=reducer, initial=initial) def _is_uniform(signals: Iterable[Signal]) -> bool: """Return True if all signals have the same signature.""" seen: set[tuple[str, ...]] = set() for s in signals: v = tuple(str(p.annotation) for p in s.signature.parameters.values()) if seen and v not in seen: # allow zero or one return False seen.add(v) return True class _relay_partial: """Small, single-purpose, mypyc-friendly variant of functools.partial. Used to call SignalRelay._slot_relay with a specific loc. __hash__ and __eq__ are implemented to make this object hashable and comparable to other _relay_partial objects, so that adding it to a set twice will not create two separate entries. 
""" __slots__ = ("loc", "relay") def __init__(self, relay: SignalRelay, loc: PathStep | None = None) -> None: self.relay = relay self.loc = loc def __call__(self, *args: Any) -> Any: return self.relay._slot_relay(*args, loc=self.loc) def __hash__(self) -> int: return hash((self.relay, self.loc)) def __eq__(self, other: Any) -> bool: return ( isinstance(other, _relay_partial) and self.relay == other.relay and self.loc == other.loc ) psygnal-0.15.0/src/psygnal/_group_descriptor.py0000644000000000000000000007707315073705675016604 0ustar00from __future__ import annotations import copy import operator import sys import warnings import weakref from collections.abc import Callable from contextlib import suppress from typing import ( TYPE_CHECKING, Any, ClassVar, Literal, TypeVar, cast, overload, ) from ._dataclass_utils import iter_fields from ._group import PathStep, SignalGroup from ._signal import Signal, SignalInstance if TYPE_CHECKING: from _weakref import ref as ref from collections.abc import Iterable, Mapping from typing import TypeAlias from psygnal._group import EmissionInfo from psygnal._weak_callback import RefErrorChoice, WeakCallback EqOperator: TypeAlias = Callable[[Any, Any], bool] FieldAliasFunc: TypeAlias = Callable[[str], str | None] __all__ = ["SignalGroupDescriptor", "get_evented_namespace", "is_evented"] T = TypeVar("T", bound=type) S = TypeVar("S") F = TypeVar("F", bound=Callable) _EQ_OPERATORS: dict[type, dict[str, EqOperator]] = {} _EQ_OPERATOR_NAME = "__eq_operators__" PSYGNAL_GROUP_NAME = "_psygnal_group_" PATCHED_BY_PSYGNAL = "_patched_by_psygnal_" _NULL = object() def _get_eq_operator_map(cls: type) -> dict[str, EqOperator]: """Return the map of field_name -> equality operator for the class.""" # if the class has an __eq_operators__ attribute, we use it # otherwise use/create the entry for `cls` in the global _EQ_OPERATORS map if hasattr(cls, _EQ_OPERATOR_NAME): return cast("dict", getattr(cls, _EQ_OPERATOR_NAME)) else: return _EQ_OPERATORS.setdefault(cls, {}) def _check_field_equality( cls: type, name: str, before: Any, after: Any, _fail: bool = False ) -> bool: """Test if two values are equal for a given field. This function will look for a field-specific operator in the the `__eq_operators__` attribute of the class if present, otherwise it will use the default equality operator for the type. Parameters ---------- cls : type The class that contains the field. name : str The name of the field. before : Any The value of the field before the change. after : Any The value of the field after the change. _fail : bool, optional If True, raise a ValueError if the field is not found in the class. by default False Returns ------- bool True if the values are equal, False otherwise. 
""" if before is _NULL: # field didn't exist to begin with (unlikely) return after is _NULL # pragma: no cover eq_map = _get_eq_operator_map(cls) # get and execute the equality operator for the field are_equal = eq_map.setdefault(name, operator.eq) try: # may fail depending on the __eq__ method for the type return bool(are_equal(after, before)) except Exception: if _fail: raise # pragma: no cover # if we fail, we try to pick a new equality operator # if it's a numpy array, we use np.array_equal # finally, fallback to operator.is_ np = sys.modules.get("numpy", None) if ( hasattr(after, "__array__") and np is not None and are_equal is not np.array_equal ): eq_map[name] = np.array_equal return _check_field_equality(cls, name, before, after, _fail=False) else: # pragma: no cover # at some point, dask array started hitting in the above condition # so we add explicit casing in _pick_equality_operator # but we keep this fallback just in case eq_map[name] = operator.is_ return _check_field_equality(cls, name, before, after, _fail=True) def _pick_equality_operator(type_: type | None) -> EqOperator: """Get the default equality operator for a given type.""" np = sys.modules.get("numpy", None) if getattr(type_, "__module__", "").startswith("dask"): # for dask, simply check if the values are the same object # this is to avoid accidentally triggering a computation with array_equal return operator.is_ if np is not None and hasattr(type_, "__array__"): return np.array_equal # type: ignore [no-any-return] return operator.eq class _DataclassFieldSignalInstance(SignalInstance): """The type of SignalInstance when emitting dataclass field changes.""" def _connect_child_event_listener(self, slot: Callable[..., Any]) -> None: # ------------ Emit this signal when the field changes ------------ # # _DataclassFieldSignalInstance is a SignalInstance that is used for fields # on evented dataclasses. For example `team.events.leader` is a # _DataclassFieldSignalInstance that emits when the leader field changes. # (e.g. `team.leader = new_leader`) # # However, by default, it does NOT emit when the leader itself changes. # (e.g. `team.leader.age = 60`) # ... because team.leader may not be an evented object, and we can't # assume that we can track changes on it. # # However, if `team.leader` IS itself an evented object, we can connect # to its events, and emit this signal when it changes. That's what we do here. # First, ensure that this SignalInstance is indeed a SignalInstance on # a SignalGroup (presumably a SignalGroupDescriptor) if not isinstance((group := self.instance), SignalGroup): return # then get the root object of the group (e.g. "team") root_object = group.instance # get the field name (e.g. "leader") representing this SignalInstance field_name = self.name # then get the value of the field (e.g. "team.leader") try: member = getattr(root_object, field_name) except Exception: return # If that member is itself evented (e.g. "team.leader" is an evented obj) # then grab the SignalGroup on it (e.g. "team.leader.events") if group := _find_signal_group(member): # next, watch for ANY changes on the member # and call the slot with the new value of the entire field, # # e.g. team.leader.events.connect(lambda: callback(team.leader)) def _on_any_change(info: EmissionInfo) -> None: new_val = getattr(root_object, field_name) # TODO somehow get old value? ... # note that old_value is available only in evented_setattr # but the `_handle_child_event_connections` could potentially be a # place to call slot(new_val, old_val)... 
if we can somehow # get the slot to it. old_val: Any = None slot(new_val, old_val) group.connect(_on_any_change, check_nargs=False, on_ref_error="ignore") def connect_setattr( self, obj: ref | object, attr: str, maxargs: int | None | object = 1, *, on_ref_error: RefErrorChoice = "warn", priority: int = 0, ) -> WeakCallback[None]: return super().connect_setattr( obj, attr, maxargs, on_ref_error=on_ref_error, priority=priority ) def _psygnal_relocate_info_(self, emission_info: EmissionInfo) -> EmissionInfo: """Relocate the emission info to the field's attribute.""" return emission_info.insert_path(PathStep(attr=self.name)) def _build_dataclass_signal_group( cls: type, signal_group_class: type[SignalGroup], equality_operators: Iterable[tuple[str, EqOperator]] | None = None, signal_aliases: Mapping[str, str | None] | FieldAliasFunc | None = None, ) -> type[SignalGroup]: """Build a SignalGroup with events for each field in a dataclass. Parameters ---------- cls : type the dataclass to look for the fields to connect with signals. signal_group_class: type[SignalGroup] SignalGroup or a subclass of it, to use as a super class. Default to SignalGroup equality_operators: Iterable[tuple[str, EqOperator]] | None If defined, a mapping of field name and equality operator to use to compare if each field was modified after being set. Default to None signal_aliases: Mapping[str, str | None] | Callable[[str], str | None] | None If defined, a mapping between field name and signal name. Field names that are not `signal_aliases` keys are not aliased (the signal name is the field name). If the dict value is None, do not create a signal associated with this field. If a callable, the signal name is the output of the function applied to the field name. If the output is None, no signal is created for this field. If None, defaults to an empty dict, no aliases. 
Default to None """ group_name = f"{cls.__name__}{signal_group_class.__name__}" # parse arguments _equality_operators = dict(equality_operators) if equality_operators else {} eq_map = _get_eq_operator_map(cls) # prepare signal_aliases lookup transform: FieldAliasFunc | None = None _signal_aliases: dict[str, str | None] = {} if callable(signal_aliases): transform = signal_aliases else: _signal_aliases = dict(signal_aliases) if signal_aliases else {} signal_group_sig_names = list(getattr(signal_group_class, "_psygnal_signals", {})) signal_group_sig_aliases = cast( "dict[str, str | None]", dict(getattr(signal_group_class, "_psygnal_aliases", {})), ) signals = {} # create a Signal for each field in the dataclass for name, type_ in iter_fields(cls): if name in _equality_operators: if not callable(_equality_operators[name]): # pragma: no cover raise TypeError("EqOperator must be callable") eq_map[name] = _equality_operators[name] else: eq_map[name] = _pick_equality_operator(type_) # Resolve the signal name for the field sig_name: str | None if name in _signal_aliases: # an alias has been provided in a mapping sig_name = _signal_aliases[name] elif callable(transform): # a callable has been provided _signal_aliases[name] = sig_name = transform(name) elif name in signal_group_sig_aliases: # an alias has been defined in the class _signal_aliases[name] = sig_name = signal_group_sig_aliases[name] else: # no alias has been defined, use the field name as the signal name sig_name = name # An alias mapping or callable returned `None`, skip this field if sig_name is None: continue # Repeated signal if sig_name in signals: key = next((k for k, v in _signal_aliases.items() if v == sig_name), None) warnings.warn( f"Skip signal {sig_name!r}, was already created in {group_name}, " f"from field {key}", UserWarning, stacklevel=2, ) continue if sig_name in signal_group_sig_names: warnings.warn( f"Skip signal {sig_name!r}, was already defined by " f"{signal_group_class}", UserWarning, stacklevel=2, ) continue # Create the Signal field_type = object if type_ is None else type_ signals[sig_name] = sig = Signal(field_type, field_type) # patch in our custom SignalInstance class with maxargs=1 on connect_setattr sig._signal_instance_class = _DataclassFieldSignalInstance # Create `signal_group_class` subclass with the attached signals and signal_aliases return type( group_name, (signal_group_class,), signals, signal_aliases=_signal_aliases ) def is_evented(obj: object) -> bool: """Return `True` if the object or its class has been decorated with evented. This also works for a __setattr__ method that has been patched by psygnal. """ return hasattr(obj, PSYGNAL_GROUP_NAME) or hasattr(obj, PATCHED_BY_PSYGNAL) def get_evented_namespace(obj: object) -> str | None: """Return the name of the evented SignalGroup for an object. Note: if you get the returned name as an attribute of the object, it will be a SignalGroup instance only if `obj` is an *instance* of an evented class. If `obj` is the evented class itself, it will be a `_SignalGroupDescriptor`. Examples -------- ```python from psygnal import evented, get_evented_namespace, is_evented @evented(events_namespace="my_events") class Foo: ... assert get_evented_namespace(Foo) == "my_events" assert is_evented(Foo) ``` """ return getattr(obj, PSYGNAL_GROUP_NAME, None) def get_signal_from_field(group: SignalGroup, name: str) -> SignalInstance | None: """Get the signal from a SignalGroup corresponding to the field `name`. 
Parameters ---------- group: SignalGroup the signal group attached to a dataclass. name: str the name of field Returns ------- SignalInstance | None the `SignalInstance` corresponding with the field `name` or None if the field was skipped or does not have an associated `SignalInstance`. """ sig_name = group._psygnal_aliases.get(name, name) if sig_name is None or sig_name not in group: return None return group[sig_name] class _changes_emitted: def __init__( self, obj: object, field: str, signal: SignalInstance, old_value: Any ) -> None: self.obj = obj self.field = field self.signal = signal self.old_value = old_value def __enter__(self) -> None: return def __exit__(self, *args: Any) -> None: new: Any = getattr(self.obj, self.field, _NULL) if not _check_field_equality(type(self.obj), self.field, self.old_value, new): self.signal.emit(new, self.old_value) SetAttr = Callable[[Any, str, Any], None] @overload def evented_setattr( signal_group_name: str, super_setattr: SetAttr, with_aliases: bool = ... ) -> SetAttr: ... @overload def evented_setattr( signal_group_name: str, super_setattr: Literal[None] | None = ..., with_aliases: bool = ..., ) -> Callable[[SetAttr], SetAttr]: ... def evented_setattr( signal_group_name: str, super_setattr: SetAttr | None = None, with_aliases: bool = True, ) -> SetAttr | Callable[[SetAttr], SetAttr]: """Create a new __setattr__ method that emits events when fields change. `signal_group_name` MUST point to an attribute on the `self` object provided to __setattr__ that obeys the following "SignalGroup interface": 1. For every "evented" field in the class, there must be a corresponding attribute on the SignalGroup instance: `assert hasattr(signal_group, attr_name)` 2. The object returned by `getattr(signal_group, attr_name)` must be a SignalInstance-like object, i.e. it must have an `emit` method that accepts one (or more) positional arguments. ```python class SignalInstanceProtocol(Protocol): def emit(self, *args: Any) -> Any: ... class SignalGroupProtocol(Protocol): def __getattr__(self, name: str) -> SignalInstanceProtocol: ... ``` Parameters ---------- signal_group_name : str, optional The name of the attribute on `self` that holds the `SignalGroup` instance, by default "_psygnal_group_". super_setattr: Callable The original __setattr__ method for the class. with_aliases: bool, optional Whether to lookup the signal name in the signal aliases mapping, by default True. This is slightly slower, and so can be set to False if you know you don't have any signal aliases. 
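    Examples
    --------
    A sketch of manual decoration (assumes the group is exposed through a
    `SignalGroupDescriptor` named `_events`; the class `Foo` is illustrative
    only):

    ```python
    from dataclasses import dataclass
    from typing import Any, ClassVar

    from psygnal import SignalGroupDescriptor
    from psygnal._group_descriptor import evented_setattr  # private import


    @dataclass
    class Foo:
        x: int = 0
        _events: ClassVar = SignalGroupDescriptor(patch_setattr=False)

        @evented_setattr("_events")
        def __setattr__(self, name: str, value: Any) -> None:
            super().__setattr__(name, value)


    foo = Foo()
    foo._events.x.connect(print)
    foo.x = 1  # the patched __setattr__ emits (new, old) -> prints "1 0"
    ```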
""" def _inner(super_setattr: SetAttr) -> SetAttr: # don't patch twice if getattr(super_setattr, PATCHED_BY_PSYGNAL, False): return super_setattr # pick a slightly faster signal lookup if we don't need aliases get_signal: Callable[[SignalGroup, str], SignalInstance | None] = ( get_signal_from_field if with_aliases else SignalGroup.__getitem__ ) def _setattr_and_emit_(self: object, name: str, value: Any) -> None: """New __setattr__ method that emits events when fields change.""" if name == signal_group_name: return super_setattr(self, name, value) group = cast("SignalGroup", getattr(self, signal_group_name)) if not with_aliases and name not in group: return super_setattr(self, name, value) # Get the signal for this field signal = get_signal(group, name) # Fast path: If signal doesn't exist or has no listeners, and group has # no listeners, skip all the expensive operations if signal is None or (len(signal) < 1 and not len(group._psygnal_relay)): return super_setattr(self, name, value) # Slow path: We have listeners, so do the full work old_value = getattr(self, name, None) with _changes_emitted(self, name, signal, old_value): super_setattr(self, name, value) # Only handle child events for evented fields if is_evented(value): callback = group._psygnal_relay._relay_partial(PathStep(attr=name)) _handle_child_event_connections(old_value, value, callback) setattr(_setattr_and_emit_, PATCHED_BY_PSYGNAL, True) return _setattr_and_emit_ return _inner(super_setattr) if super_setattr else _inner class SignalGroupDescriptor: """Create a [`psygnal.SignalGroup`][] on first instance attribute access. This descriptor is designed to be used as a class attribute on a dataclass-like class (e.g. a [`dataclass`](https://docs.python.org/3/library/dataclasses.html), a [`pydantic.BaseModel`](https://docs.pydantic.dev/usage/models/), an [attrs](https://www.attrs.org/en/stable/overview.html) class, a [`msgspec.Struct`](https://jcristharif.com/msgspec/structs.html)) On first access of the descriptor on an instance, it will create a [`SignalGroup`][psygnal.SignalGroup] bound to the instance, with a [`SignalInstance`][psygnal.SignalInstance] for each field in the dataclass. !!!important Using this descriptor will *patch* the class's `__setattr__` method to emit events when fields change. (That patching occurs on first access of the descriptor name on an instance). To prevent this patching, you can set `patch_setattr=False` when creating the descriptor, but then you will need to manually call `emit` on the appropriate `SignalInstance` when you want to emit an event. Or you can use `evented_setattr` yourself ```python from psygnal._group_descriptor import evented_setattr from psygnal import SignalGroupDescriptor from dataclasses import dataclass from typing import ClassVar @dataclass class Foo: x: int _events: ClassVar = SignalGroupDescriptor(patch_setattr=False) @evented_setattr("_events") # pass the name of your SignalGroup def __setattr__(self, name: str, value: Any) -> None: super().__setattr__(name, value) ``` *This currently requires a private import, please open an issue if you would like to depend on this functionality.* Parameters ---------- equality_operators : dict[str, Callable[[Any, Any], bool]], optional A dictionary mapping field names to custom equality operators, where an equality operator is a callable that accepts two arguments and returns True if the two objects are equal. This will be used when comparing the old and new values of a field to determine whether to emit an event. 
If not provided, the default equality operator is `operator.eq`, except for numpy arrays, where `np.array_equal` is used. warn_on_no_fields : bool, optional If `True` (the default), a warning will be emitted if no mutable dataclass-like fields are found on the object. cache_on_instance : bool, optional If `True` (the default), a newly-created SignalGroup instance will be cached on the instance itself, so that subsequent accesses to the descriptor will return the same SignalGroup instance. This makes for slightly faster subsequent access, but means that the owner instance will no longer be pickleable. If `False`, the SignalGroup instance will *still* be cached, but not on the instance itself. patch_setattr : bool, optional If `True` (the default), a new `__setattr__` method will be created that emits events when fields change. If `False`, no `__setattr__` method will be created. (This will prevent signal emission, and assumes you are using a different mechanism to emit signals when fields change.) signal_group_class : type[SignalGroup] | None, optional A custom SignalGroup class to use, SignalGroup if None, by default None collect_fields : bool, optional Create a signal for each field in the dataclass. If True, the `SignalGroup` instance will be a subclass of `signal_group_class` (SignalGroup if it is None). If False, a deepcopy of `signal_group_class` will be used. Default to True connect_child_events : bool, optional If `True`, will connect events from all fields on the dataclass whose type is also "evented" to the group on the parent object. An object is considered "evented" if the `is_evented` function returns `True` for it (i.e. it has been decorated with `@evented`, or if it has a SignalGroupDescriptor). This is useful for nested evented dataclasses, where you want to monitor events emitted from arbitrarily deep children on the parent object. By default True. signal_aliases: Mapping[str, str | None] | Callable[[str], str | None] | None If defined, a mapping between field name and signal name. Field names that are not `signal_aliases` keys are not aliased (the signal name is the field name). If the dict value is None, do not create a signal associated with this field. If a callable, the signal name is the output of the function applied to the field name. If the output is None, no signal is created for this field. If None, defaults to an empty dict, no aliases. 
Default to None Examples -------- ```python from typing import ClassVar from dataclasses import dataclass from psygnal import SignalGroupDescriptor @dataclass class Person: name: str age: int = 0 events: ClassVar[SignalGroupDescriptor] = SignalGroupDescriptor() john = Person("John", 40) john.events.age.connect(print) john.age += 1 # prints 41 ``` """ # map of id(obj) -> SignalGroup # cached here in case the object isn't modifiable _instance_map: ClassVar[dict[int, SignalGroup]] = {} def __init__( self, *, equality_operators: dict[str, EqOperator] | None = None, warn_on_no_fields: bool = True, cache_on_instance: bool = True, patch_setattr: bool = True, signal_group_class: type[SignalGroup] | None = None, collect_fields: bool = True, connect_child_events: bool = True, signal_aliases: Mapping[str, str | None] | FieldAliasFunc | None = None, ): grp_cls = signal_group_class or SignalGroup if not (isinstance(grp_cls, type) and issubclass(grp_cls, SignalGroup)): raise TypeError( # pragma: no cover f"'signal_group_class' must be a subclass of SignalGroup, not {grp_cls}" ) if not collect_fields: if grp_cls is SignalGroup: raise ValueError( "Cannot use SignalGroup with `collect_fields=False`. " "Use a custom SignalGroup subclass instead." ) if callable(signal_aliases): raise ValueError( "Cannot use a Callable for `signal_aliases` with " "`collect_fields=False`" ) self._name: str | None = None self._eqop = tuple(equality_operators.items()) if equality_operators else None self._warn_on_no_fields = warn_on_no_fields self._cache_on_instance = cache_on_instance self._patch_setattr = patch_setattr self._connect_child_events = connect_child_events self._signal_group_class: type[SignalGroup] = grp_cls self._collect_fields = collect_fields self._signal_aliases = signal_aliases self._signal_groups: dict[int, type[SignalGroup]] = {} def __set_name__(self, owner: type, name: str) -> None: """Called when this descriptor is added to class `owner` as attribute `name`.""" self._name = name with suppress(AttributeError): # This is the flag that identifies this object as evented setattr(owner, PSYGNAL_GROUP_NAME, name) def _do_patch_setattr(self, owner: type, with_aliases: bool = True) -> None: """Patch the owner class's __setattr__ method to emit events.""" if not self._patch_setattr: return if getattr(owner.__setattr__, PATCHED_BY_PSYGNAL, False): return name = self._name if not (name and hasattr(owner, name)): # pragma: no cover # this should never happen... but if it does, we'll get errors # every time we set an attribute on the class. So raise now. raise AttributeError("SignalGroupDescriptor has not been set on the class") try: # assign a new __setattr__ method to the class owner.__setattr__ = evented_setattr( # type: ignore name, owner.__setattr__, # type: ignore with_aliases=with_aliases, ) except Exception as e: # pragma: no cover # not sure what might cause this ... but it will have consequences raise type(e)( f"Could not update __setattr__ on class: {owner}. Events will not be " "emitted when fields change." ) from e @overload def __get__(self, instance: None, owner: type) -> SignalGroupDescriptor: ... @overload def __get__(self, instance: object, owner: type) -> SignalGroup: ... def __get__( self, instance: object, owner: type ) -> SignalGroup | SignalGroupDescriptor: """Return a SignalGroup instance for `instance`.""" if instance is None: return self signal_group = self._get_signal_group(owner) # if we haven't yet instantiated a SignalGroup for this instance, # do it now and cache it. 
Note that we cache it here in addition to # the instance (in case the instance is not modifiable). obj_id = id(instance) if obj_id not in self._instance_map: # cache it self._instance_map[obj_id] = grp = signal_group(instance) # also *try* to set it on the instance as well, since it will skip all the # __get__ logic in the future, but if it fails, no big deal. if self._name and self._cache_on_instance: with suppress(Exception): setattr(instance, self._name, grp) # clean up the cache when the instance is deleted with suppress(TypeError): # if it's not weakref-able weakref.finalize(instance, self._instance_map.pop, obj_id, None) # Register callback to connect child events on first connection if requested if self._connect_child_events: grp._psygnal_relay._on_first_connect_callbacks.append( lambda: connect_child_events(instance, recurse=True, _group=grp) ) return self._instance_map[obj_id] def _get_signal_group(self, owner: type) -> type[SignalGroup]: type_id = id(owner) if type_id not in self._signal_groups: self._signal_groups[type_id] = self._create_group(owner) return self._signal_groups[type_id] def _create_group(self, owner: type) -> type[SignalGroup]: if not self._collect_fields: # Do not collect fields from owner class Group = copy.deepcopy(self._signal_group_class) # Add aliases if isinstance(self._signal_aliases, dict): Group._psygnal_aliases.update(self._signal_aliases) else: # Collect fields and create SignalGroup subclass Group = _build_dataclass_signal_group( owner, self._signal_group_class, equality_operators=self._eqop, signal_aliases=self._signal_aliases, ) if self._warn_on_no_fields and not Group._psygnal_signals: warnings.warn( f"No mutable fields found on class {owner}: no events will be " "emitted. (Is this a dataclass, attrs, msgspec, or pydantic model?)", stacklevel=2, ) self._do_patch_setattr(owner, with_aliases=bool(Group._psygnal_aliases)) return Group def _find_signal_group(obj: object, default_name: str = "events") -> SignalGroup | None: # look for default "events" name as well maybe_group = getattr(obj, get_evented_namespace(obj) or default_name, None) return maybe_group if isinstance(maybe_group, SignalGroup) else None def connect_child_events( obj: object, recurse: bool = False, _group: SignalGroup | None = None ) -> None: """Connect events from evented children to a parent SignalGroup. `obj` must be an evented dataclass-style object. This is useful when you have a tree of objects, and you want to connect all events from the children to the parent. Parameters. ---------- obj : object The object to connect events from. If it is not evented, this function will do nothing. recurse : bool, optional If `True`, will also connect events from all evented children of `obj`, by default `False`. _group : SignalGroup, optional (This is used internally during recursion.) The SignalGroup to connect to. If not provided, will be found by calling `get_evented_namespace(obj)`. By default None. 
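    Examples
    --------
    A sketch of event bubbling with nested evented dataclasses (`Parent` and
    `Child` are illustrative; callbacks connected to the parent group receive
    `EmissionInfo` objects describing the child event):

    ```python
    from dataclasses import dataclass, field
    from typing import ClassVar

    from psygnal import SignalGroupDescriptor


    @dataclass
    class Child:
        value: int = 0
        events: ClassVar = SignalGroupDescriptor()


    @dataclass
    class Parent:
        child: Child = field(default_factory=Child)
        events: ClassVar = SignalGroupDescriptor()  # connect_child_events=True by default


    parent = Parent()
    parent.events.connect(print)  # first connection triggers connect_child_events
    parent.child.value = 42  # the child's event is relayed to parent.events
    ```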
""" if _group is None and (_group := _find_signal_group(obj)) is None: return # pragma: no cover # not evented for loc, _ in iter_fields(type(obj), exclude_frozen=True): _connect_if_evented( getattr(obj, loc, None), _group._psygnal_relay._relay_partial(PathStep(attr=loc)), recurse=recurse, ) def _connect_if_evented(obj: Any, callback: Callable, recurse: bool) -> None: """Connect a `callback` to the signal group on `obj`.""" if (signal_group := _find_signal_group(obj)) is not None: signal_group.connect( callback, check_nargs=False, check_types=False, on_ref_error="ignore", # compiled objects are not weakref-able ) if recurse: connect_child_events(obj, recurse=True, _group=signal_group) def _handle_child_event_connections( old_value: Any, new_value: Any, callback: Callable ) -> None: # Disconnect old_value if it was evented if ( is_evented(old_value) and (old_group := _find_signal_group(old_value)) is not None ): # Disconnect from the old object old_group.disconnect(callback) _connect_if_evented(new_value, callback, recurse=True) psygnal-0.15.0/src/psygnal/_mypyc.py0000644000000000000000000000067115073705675014341 0ustar00__all__ = ["mypyc_attr"] try: # mypy_extensions is not required at runtime, it's only used by mypyc # to provide type information to the mypyc compiler. # we include it in the [tool.hatch.build.targets.wheel.hooks.mypyc] # section of pyproject.toml so that it's available when building. from mypy_extensions import mypyc_attr except ImportError: def mypyc_attr(*_, **__): # type: ignore return lambda x: x psygnal-0.15.0/src/psygnal/_queue.py0000644000000000000000000000736315073705675014331 0ustar00from __future__ import annotations from collections import defaultdict from collections.abc import Callable from queue import Queue from threading import Thread, current_thread, main_thread from typing import Any, ClassVar, Literal from ._exceptions import EmitLoopError from ._weak_callback import WeakCallback Callback = Callable[[tuple[Any, ...]], Any] CbArgsTuple = tuple[Callback, tuple] class QueuedCallback(WeakCallback): """WeakCallback that queues the callback to be called on a different thread. (...rather than invoking it immediately.) Parameters ---------- wrapped : WeakCallback The actual callback to be invoked. thread : Thread | Literal["main", "current"] | None The thread on which to invoke the callback. If not provided, the main thread will be used. """ _GLOBAL_QUEUE: ClassVar[defaultdict[Thread, Queue[CbArgsTuple]]] = defaultdict( Queue ) def __init__( self, wrapped: WeakCallback, thread: Thread | Literal["main", "current"] | None = None, ) -> None: self._wrapped = wrapped # keeping the wrapped key allows this slot to be disconnected # regardless of whether it was connected with type='queue' or 'direct' ... 
self._key: str = wrapped._key self._max_args: int | None = wrapped._max_args self._alive: bool = wrapped._alive self._on_ref_error = wrapped._on_ref_error if thread is None or thread == "main": thread = main_thread() elif thread == "current": thread = current_thread() elif not isinstance(thread, Thread): # pragma: no cover raise TypeError( f"`thread` must be a Thread instance, not {type(thread).__name__}" ) # NOTE: for some strange reason, mypyc crashes if we use `self._thread` here # so we use `self._cbthread` instead self._cbthread = thread self.priority: int = wrapped.priority def cb(self, args: tuple = ()) -> None: if current_thread() is self._cbthread: self._wrapped.cb(args) else: QueuedCallback._GLOBAL_QUEUE[self._cbthread].put((self._wrapped.cb, args)) def dereference(self) -> Callable | None: return self._wrapped.dereference() def __eq__(self, other: object) -> bool: """Compare QueuedCallback instances for equality based on wrapped callback. This method is explicitly defined to avoid mypyc gen_glue_ne_method AssertionError when building on Python 3.11+. Without this explicit definition, mypyc tries to generate glue methods for the inheritance hierarchy and fails with an AssertionError in gen_glue_ne_method. """ if isinstance(other, QueuedCallback): return self._wrapped == other._wrapped return NotImplemented def emit_queued(thread: Thread | None = None) -> None: """Trigger emissions of all callbacks queued in the current thread. Parameters ---------- thread : Thread, optional The thread on which to invoke the callback. If not provided, the main thread will be used. Raises ------ EmitLoopError If an exception is raised while invoking a queued callback. This exception can be caught and optionally suppressed or handled by the caller, allowing the emission of other queued callbacks to continue even if one of them raises an exception. """ _thread = current_thread() if thread is None else thread queue = QueuedCallback._GLOBAL_QUEUE[_thread] while not queue.empty(): cb, args = queue.get() try: cb(args) except Exception as e: # pragma: no cover raise EmitLoopError(exc=e) from e psygnal-0.15.0/src/psygnal/_signal.py0000644000000000000000000020420315073705675014452 0ustar00"""The main Signal class and SignalInstance class. A note on the "reemission" parameter in Signal and SignalInstances. This controls the behavior of the signal when a callback emits the signal. Since it can be a little confusing, take the following example of a Signal that emits an integer. 
We'll connect three callbacks to it, two of which re-emit the same signal with a different value: ```python from psygnal import SignalInstance # a signal that emits an integer sig = SignalInstance((int,), reemission="...") def cb1(value: int) -> None: print(f"calling cb1 with: {value}") if value == 1: # cb1 ALSO triggers an emission of the value 2 sig.emit(2) def cb2(value: int) -> None: print(f"calling cb2 with: {value}") if value == 2: # cb2 ALSO triggers an emission of the value 3 sig.emit(3) def cb3(value: int) -> None: print(f"calling cb3 with: {value}") sig.connect(cb1) sig.connect(cb2) sig.connect(cb3) sig.emit(1) ``` with `reemission="queued"` above: you see a breadth-first pattern: ALL callbacks are called with the first emitted value, before ANY of them are called with the second emitted value (emitted by the first connected callback cb1) ``` calling cb1 with: 1 calling cb2 with: 1 calling cb3 with: 1 calling cb1 with: 2 calling cb2 with: 2 calling cb3 with: 2 calling cb1 with: 3 calling cb2 with: 3 calling cb3 with: 3 ``` with `reemission='immediate'` signals emitted by callbacks are immediately processed by all callbacks in a deeper level, before returning back to the original loop level to call the remaining callbacks with the original value. ``` calling cb1 with: 1 calling cb1 with: 2 calling cb2 with: 2 calling cb1 with: 3 calling cb2 with: 3 calling cb3 with: 3 calling cb3 with: 2 calling cb2 with: 1 calling cb3 with: 1 ``` with `reemission='latest'`, just as with 'immediate', signals emitted by callbacks are immediately processed by all callbacks in a deeper level. But in this case, the remaining callbacks in the current level are never called with the original value. ``` calling cb1 with: 1 calling cb1 with: 2 calling cb2 with: 2 calling cb1 with: 3 calling cb2 with: 3 calling cb3 with: 3 # cb2 is never called with 1 # cb3 is never called with 1 or 2 ``` The real-world scenario in which this usually arises is an EventedModel or dataclass. Evented models emit signals on `setattr`: ```python class MyModel(EventedModel): x: int = 1 m = MyModel(x=1) print("starting value", m.x) @m.events.x.connect def ensure_at_least_20(val: int): print("trying to set to", val) m.x = max(val, 20) m.x = 5 print("ending value", m.x) ``` ``` starting value 1 trying to set to 5 trying to set to 20 ending value 20 ``` With EventedModel.__setattr__, you can easily end up with some complicated recursive behavior if you connect an on-change callback that also sets the value of the model. In this case `reemission='latest'` is probably the most appropriate, as it will prevent the callback from being called with the original (now-stale) value. But one can conceive of other scenarios where `reemission='immediate'` or `reemission='queued'` might be more appropriate. Qt's default behavior, for example, is similar to `immediate`, but can also be configured to be like `queued` by changing the connection type (in that case, depending on threading). 
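The strategy is chosen per signal via the `reemission` argument (a small
sketch; `MyObj` is illustrative):

```python
from psygnal import Signal


class MyObj:
    changed = Signal(int, reemission="latest-only")  # or "immediate" / "queued"
```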
""" from __future__ import annotations import inspect import threading import warnings import weakref from collections import deque from collections.abc import Callable from contextlib import AbstractContextManager, contextmanager, suppress from functools import cache, partial, reduce from inspect import Parameter, Signature, isclass from types import UnionType from typing import ( TYPE_CHECKING, Any, ClassVar, Final, Literal, NoReturn, TypeVar, Union, cast, get_args, get_origin, get_type_hints, overload, ) from ._exceptions import EmitLoopError from ._mypyc import mypyc_attr from ._queue import QueuedCallback from ._weak_callback import ( StrongFunction, WeakCallback, WeakSetattr, WeakSetitem, weak_callback, ) if TYPE_CHECKING: from collections.abc import Container, Iterable, Iterator from typing import TypeAlias from ._group import EmissionInfo from ._weak_callback import RefErrorChoice # single function that does all the work of reducing an iterable of args # to a single args ReducerOneArg: TypeAlias = Callable[[Iterable[tuple]], tuple] # function that takes two args tuples. it will be passed to itertools.reduce ReducerTwoArgs: TypeAlias = Callable[[tuple, tuple], tuple] ReducerFunc: TypeAlias = ReducerOneArg | ReducerTwoArgs __all__ = ["Signal", "SignalInstance", "_compiled"] _NULL = object() F = TypeVar("F", bound=Callable) # using 300 instead of sys.getrecursionlimit() # in a mypyc-compiled program, hitting an actual RecursionError can cause # a segfault (rather than raise a python exception), so we really MUST # avoid it. Windows has a lower stack limit than other platforms, so we # use 300 as a cross-platform "safe" limit, determined via testing. It's # probably plenty large for most reasonable use-cases. RECURSION_LIMIT = 300 ReemissionVal = Literal["immediate", "queued", "latest-only"] VALID_REEMISSION = set(ReemissionVal.__args__) # type: ignore DEFAULT_REEMISSION: ReemissionVal = "immediate" # using basic class instead of enum for easier mypyc compatibility # this isn't exposed publicly anyway. class ReemissionMode: """Enumeration of reemission strategies.""" IMMEDIATE: Final = "immediate" QUEUED: Final = "queued" LATEST: Final = "latest-only" @staticmethod def validate(value: str) -> str: value = str(value).lower() if value not in ReemissionMode._members(): raise ValueError( f"Invalid reemission value. Must be one of " f"{', '.join(ReemissionMode._members())}. Not {value!r}" ) return value @staticmethod def _members() -> set[str]: return VALID_REEMISSION class Signal: """Declares a signal emitter on a class. This is class implements the [descriptor protocol](https://docs.python.org/3/howto/descriptor.html#descriptorhowto) and is designed to be used as a class attribute, with the supported signature types provided in the constructor: ```python from psygnal import Signal class MyEmitter: changed = Signal(int) def receiver(arg: int): print("new value:", arg) emitter = MyEmitter() emitter.changed.connect(receiver) emitter.changed.emit(1) # prints 'new value: 1' ``` !!! note in the example above, `MyEmitter.changed` is an instance of `Signal`, and `emitter.changed` is an instance of `SignalInstance`. See the documentation on [`SignalInstance`][psygnal.SignalInstance] for details on how to connect to and/or emit a signal on an instance of an object that has a `Signal`. Parameters ---------- *types : Type[Any] | Signature A sequence of individual types, or a *single* [`inspect.Signature`][] object. description : str Optional descriptive text for the signal. (not used internally). 
name : str | None Optional name of the signal. If it is not specified then the name of the class attribute that is bound to the signal will be used. default None check_nargs_on_connect : bool Whether to check the number of positional args against `signature` when connecting a new callback. This can also be provided at connection time using `.connect(..., check_nargs=True)`. By default, `True`. check_types_on_connect : bool Whether to check the callback parameter types against `signature` when connecting a new callback. This can also be provided at connection time using `.connect(..., check_types=True)`. By default, `False`. reemission : Literal["immediate", "queued", "latest-only"] | None Determines the order and manner in which connected callbacks are invoked when a callback re-emits a signal. Default is `"immediate"`. * `"immediate"`: Signals emitted by callbacks are immediately processed in a deeper emission loop, before returning to process signals emitted at the current level (after all callbacks in the deeper level have been called). * `"queued"`: Signals emitted by callbacks are enqueued for emission after the current level of emission is complete. This ensures *all* connected callbacks are called with the first emitted value, before *any* of them are called with values emitted while calling callbacks. * `"latest-only"`: Signals emitted by callbacks are immediately processed in a deeper emission loop, and remaining callbacks in the current level are never called with the original value. """ # _signature: Signature # callback signature for this signal _current_emitter: ClassVar[SignalInstance | None] = None def __init__( self, *types: type[Any] | Signature, description: str = "", name: str | None = None, check_nargs_on_connect: bool = True, check_types_on_connect: bool = False, reemission: ReemissionVal = DEFAULT_REEMISSION, signal_instance_class: type[SignalInstance] | None = None, ) -> None: self._name = name self.description = description self._check_nargs_on_connect = check_nargs_on_connect self._check_types_on_connect = check_types_on_connect self._reemission = reemission self._signal_instance_class = signal_instance_class or SignalInstance self._signal_instance_cache: dict[int, SignalInstance] = {} if types and isinstance(types[0], Signature): self._signature = types[0] if len(types) > 1: warnings.warn( "Only a single argument is accepted when directly providing a" f" `Signature`. These args were ignored: {types[1:]}", stacklevel=2, ) else: self._signature = _build_signature(*cast("tuple[type[Any], ...]", types)) @property def signature(self) -> Signature: """[Signature][inspect.Signature] supported by this Signal.""" return self._signature def __set_name__(self, owner: type[Any], name: str) -> None: """Set name of signal when declared as a class attribute on `owner`.""" if self._name is None: self._name = name @overload def __get__(self, instance: None, owner: type[Any] | None = None) -> Signal: ... @overload def __get__( self, instance: Any, owner: type[Any] | None = None ) -> SignalInstance: ... def __get__( self, instance: Any, owner: type[Any] | None = None ) -> Signal | SignalInstance: """Get signal instance. This is called when accessing a Signal instance. If accessed as an attribute on the class `owner`, instance, will be `None`. Otherwise, if `instance` is not None, we're being accessed on an instance of `owner`. 
class Emitter: signal = Signal() e = Emitter() Emitter.signal # instance will be None, owner will be Emitter e.signal # instance will be e, owner will be Emitter Returns ------- Signal or SignalInstance Depending on how this attribute is accessed. """ if instance is None: return self if id(instance) in self._signal_instance_cache: return self._signal_instance_cache[id(instance)] signal_instance = self._create_signal_instance(instance) # cache this signal instance so that future access returns the same instance. try: # first, try to assign it to instance.name ... this essentially breaks the # descriptor (i.e. __get__ will never again be called for this instance) # (note, this is the same mechanism used in the `cached_property` decorator) setattr(instance, cast("str", self._name), signal_instance) except AttributeError: # if that fails, which may happen in slotted classes, then we fall back to # our internal cache self._cache_signal_instance(instance, signal_instance) return signal_instance def _cache_signal_instance( self, instance: Any, signal_instance: SignalInstance ) -> None: """Cache a signal instance on the instance.""" # fallback signal instance cache as last resort. We use the object id # instead of a WeakKeyDictionary because we can't guarantee that the instance # is hashable or weak-referenceable, and we use a finalize to remove the # cache when the instance is destroyed (if the object is weak-referenceable). obj_id = id(instance) self._signal_instance_cache[obj_id] = signal_instance with suppress(TypeError): weakref.finalize(instance, self._signal_instance_cache.pop, obj_id, None) def _create_signal_instance( self, instance: Any, name: str | None = None ) -> SignalInstance: return self._signal_instance_class( self.signature, instance=instance, name=name or self._name, description=self.description, check_nargs_on_connect=self._check_nargs_on_connect, check_types_on_connect=self._check_types_on_connect, reemission=self._reemission, ) @classmethod @contextmanager def _emitting(cls, emitter: SignalInstance) -> Iterator[None]: """Context that sets the sender on a receiver object while emitting a signal.""" previous, cls._current_emitter = cls._current_emitter, emitter try: yield finally: cls._current_emitter = previous @classmethod def current_emitter(cls) -> SignalInstance | None: """Return currently emitting `SignalInstance`, if any. This will typically be used in a callback. Examples -------- ```python from psygnal import Signal def my_callback(): source = Signal.current_emitter() ``` """ return cls._current_emitter @classmethod def sender(cls) -> Any: """Return currently emitting object, if any. This will typically be used in a callback. """ return getattr(cls._current_emitter, "instance", None) _empty_signature = Signature() @mypyc_attr(allow_interpreted_subclasses=True) class SignalInstance: """A signal instance (optionally) bound to an object. In most cases, users will not create a `SignalInstance` directly -- instead creating a [Signal][psygnal.Signal] class attribute. This object will be instantiated by the `Signal.__get__` method (i.e. the descriptor protocol), when a `Signal` instance is accessed from an *instance* of a class with a `Signal` attribute. However, it is the `SignalInstance` that you will most often be interacting with when you access the name of a `Signal` on an instance -- so understanding the `SignalInstance` API is key to using psygnal.
```python class Emitter: signal = Signal() e = Emitter() # when accessed on an *instance* of Emitter, # the signal attribute will be a SignalInstance e.signal # This is what you will use to connect your callbacks e.signal.connect(some_callback) ``` Parameters ---------- signature : Signature | None The signature that this signal accepts and will emit, by default `Signature()`. instance : Any An object to which this signal is bound. Normally this will be provided by the `Signal.__get__` method (see above). However, an unbound `SignalInstance` may also be created directly. by default `None`. name : str | None An optional name for this signal. Normally this will be provided by the `Signal.__get__` method. by default `None` check_nargs_on_connect : bool Whether to check the number of positional args against `signature` when connecting a new callback. This can also be provided at connection time using `.connect(..., check_nargs=True)`. By default, `True`. check_types_on_connect : bool Whether to check the callback parameter types against `signature` when connecting a new callback. This can also be provided at connection time using `.connect(..., check_types=True)`. By default, `False`. reemission : Literal["immediate", "queued", "latest-only"] | None See docstring for [`Signal`][psygnal.Signal] for details. By default, `"immediate"`. description : str Optional descriptive text for the signal. (not used internally). Attributes ---------- signature : Signature Signature supported by this `SignalInstance`. instance : Any Object that emits this `SignalInstance`. name : str Name of this `SignalInstance`. description : str Description of this `SignalInstance`. Raises ------ TypeError If `signature` is neither an instance of `inspect.Signature`, or a `tuple` of types. 
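    Examples
    --------
    A small usage sketch with a directly-created (unbound) instance:

    ```python
    from psygnal import SignalInstance

    sig = SignalInstance((int,), name="value_changed")
    sig.connect(print)
    sig.emit(42)  # prints 42
    ```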
""" _is_blocked: bool = False _is_paused: bool = False _debug_hook: ClassVar[Callable[[EmissionInfo], None] | None] = None def __init__( self, signature: Signature | tuple = _empty_signature, *, instance: Any = None, name: str | None = None, description: str = "", check_nargs_on_connect: bool = True, check_types_on_connect: bool = False, reemission: ReemissionVal = DEFAULT_REEMISSION, ) -> None: if isinstance(signature, (list, tuple)): signature = _build_signature(*signature) elif not isinstance(signature, Signature): # pragma: no cover raise TypeError( "`signature` must be either a sequence of types, or an " "instance of `inspect.Signature`" ) self._description = description self._reemission = ReemissionMode.validate(reemission) self._name = name self._instance: Callable = self._instance_ref(instance) self._args_queue: list[tuple] = [] # filled when paused self._signature = signature self._check_nargs_on_connect = check_nargs_on_connect self._check_types_on_connect = check_types_on_connect self._slots: list[WeakCallback] = [] self._is_blocked: bool = False self._is_paused: bool = False self._lock = threading.RLock() self._emit_queue: deque[tuple] = deque() self._recursion_depth: int = 0 self._max_recursion_depth: int = 0 self._run_emit_loop_inner: Callable[[], None] if self._reemission == ReemissionMode.QUEUED: self._run_emit_loop_inner = self._run_emit_loop_queued elif self._reemission == ReemissionMode.LATEST: self._run_emit_loop_inner = self._run_emit_loop_latest_only else: self._run_emit_loop_inner = self._run_emit_loop_immediate # whether any slots in self._slots have a priority other than 0 self._priority_in_use = False @staticmethod def _instance_ref(instance: Any) -> Callable[[], Any]: if instance is None: return lambda: None try: return weakref.ref(instance) except TypeError: # fall back to strong reference if instance is not weak-referenceable return lambda: instance @property def signature(self) -> Signature: """Signature supported by this `SignalInstance`.""" return self._signature @property def instance(self) -> Any: """Object that emits this `SignalInstance`.""" return self._instance() @property def name(self) -> str: """Name of this `SignalInstance`.""" return self._name or "" @property def description(self) -> str: """Description of this `SignalInstance`.""" return self._description def __repr__(self) -> str: """Return repr.""" name = f" {self._name!r}" if self._name else "" instance = f" on {self.instance!r}" if self.instance is not None else "" return f"<{type(self).__name__}{name}{instance}>" @overload def connect( self, *, thread: threading.Thread | Literal["main", "current"] | None = ..., check_nargs: bool | None = ..., check_types: bool | None = ..., unique: bool | Literal["raise"] = ..., max_args: int | None = None, on_ref_error: RefErrorChoice = ..., priority: int = ..., emit_on_evented_child_events: bool = ..., ) -> Callable[[F], F]: ... @overload def connect( self, slot: F, *, thread: threading.Thread | Literal["main", "current"] | None = ..., check_nargs: bool | None = ..., check_types: bool | None = ..., unique: bool | Literal["raise"] = ..., max_args: int | None = None, on_ref_error: RefErrorChoice = ..., priority: int = ..., emit_on_evented_child_events: bool = ..., ) -> F: ... 
def connect( self, slot: F | None = None, *, thread: threading.Thread | Literal["main", "current"] | None = None, check_nargs: bool | None = None, check_types: bool | None = None, unique: bool | Literal["raise"] = False, max_args: int | None = None, on_ref_error: RefErrorChoice = "warn", priority: int = 0, emit_on_evented_child_events: bool = False, ) -> Callable[[F], F] | F: """Connect a callback (`slot`) to this signal. `slot` is compatible if: * it requires no more than the number of positional arguments emitted by this `SignalInstance`. (It *may* require less) * it has no *required* keyword arguments (keyword only arguments that have no default). * if `check_types` is `True`, the parameter types in the callback signature must match the signature of this `SignalInstance`. This method may be used as a decorator. ```python @signal.connect def my_function(): ... ``` !!!important If a signal is connected with `thread != None`, then it is up to the user to ensure that `psygnal.emit_queued` is called, or that one of the backend convenience functions is used (e.g. `psygnal.qt.start_emitting_from_queue`). Otherwise, callbacks that are connected to signals that are emitted from another thread will never be called. Parameters ---------- slot : Callable A callable to connect to this signal. If the callable accepts less arguments than the signature of this slot, then they will be discarded when calling the slot. check_nargs : Optional[bool] If `True` and the provided `slot` requires more positional arguments than the signature of this Signal, raise `TypeError`. by default `True`. thread: Thread | Literal["main", "current"] | None If `None` (the default), this slot will be invoked immediately when a signal is emitted, from whatever thread emitted the signal. If a thread object is provided, then the callback will only be immediately invoked if the signal is emitted from that thread. Otherwise, the callback will be added to a queue. **Note!**, when using the `thread` parameter, the user is responsible for calling `psygnal.emit_queued()` in the corresponding thread, otherwise the slot will never be invoked. (See note above). (The strings `"main"` and `"current"` are also accepted, and will be interpreted as the `threading.main_thread()` and `threading.current_thread()`, respectively). check_types : Optional[bool] If `True`, An additional check will be performed to make sure that types declared in the slot signature are compatible with the signature declared by this signal, by default `False`. unique : Union[bool, str, None] If `True`, returns without connecting if the slot has already been connected. If the literal string "raise" is passed to `unique`, then a `ValueError` will be raised if the slot is already connected. By default `False`. max_args : Optional[int] If provided, `slot` will be called with no more more than `max_args` when this SignalInstance is emitted. (regardless of how many arguments are emitted). on_ref_error : {'raise', 'warn', 'ignore'}, optional What to do if a weak reference cannot be created. If 'raise', a ReferenceError will be raised. If 'warn' (default), a warning will be issued and a strong-reference will be used. If 'ignore' a strong-reference will be used (silently). priority : int The priority of the callback. This is used to determine the order in which callbacks are called when multiple are connected to the same signal. Higher priority callbacks are called first. Negative values are allowed. The default is 0. 
emit_on_evented_child_events : bool If `True`, and if this is a SignalInstance associated with a specific field on an evented dataclass, and if that field itself is an evented dataclass, then the slot will be called both when the field is set directly, *and* when a child member of that field is set. For example, if `Team` is an evented-dataclass with a field `leader: Person` which is itself an evented-dataclass, then `team.events.leader.connect(callback, emit_on_evented_child_events=True)` will invoke callback even when `team.leader.age` is mutated (in addition to when `team.leader` is set directly). Raises ------ TypeError If a non-callable object is provided. ValueError If the provided slot fails validation, either due to mismatched positional argument requirements, or failed type checking. ValueError If `unique` is `'raise'` and `slot` has already been connected. """ if check_nargs is None: check_nargs = self._check_nargs_on_connect if check_types is None: check_types = self._check_types_on_connect def _wrapper( slot: F, max_args: int | None = max_args, _on_ref_err: RefErrorChoice = on_ref_error, ) -> F: if not callable(slot): raise TypeError(f"Cannot connect to non-callable object: {slot}") with self._lock: if unique and slot in self: if unique == "raise": raise ValueError( "Slot already connect. Use `connect(..., unique=False)` " "to allow duplicate connections" ) return slot slot_sig: Signature | None = None if check_nargs and (max_args is None): slot_sig, max_args, isqt = self._check_nargs(slot, self.signature) if isqt: _on_ref_err = "ignore" if check_types: slot_sig = slot_sig or signature(slot) if not _parameter_types_match(slot, self.signature, slot_sig): extra = f"- Slot types {slot_sig} do not match types in signal." self._raise_connection_error(slot, extra) cb = weak_callback( slot, max_args=max_args, finalize=self._try_discard, on_ref_error=_on_ref_err, priority=priority, ) if thread is not None: cb = QueuedCallback(cb, thread=thread) self._append_slot(cb) if emit_on_evented_child_events: self._connect_child_event_listener(slot) return slot return _wrapper if slot is None else _wrapper(slot) def _connect_child_event_listener(self, slot: Callable) -> None: """Connect a child event listener to the slot. This is called when a slot is connected to this signal. It allows subclasses to connect additional event listeners to the slot. """ # implementing this as a method allows us to override/extend it in subclasses pass # pragma: no cover def _append_slot(self, slot: WeakCallback) -> None: """Append a slot to the list of slots. Implementing this as a method allows us to override/extend it in subclasses. """ # if no previously connected slots have a priority, and this slot also # has no priority, we can just (quickly) append it to the end of the list. if not self._priority_in_use: if not slot.priority: self._slots.append(slot) return # remember that we have a priority in use, so we skip this check self._priority_in_use = True # otherwise we need to (slowly) iterate over self._slots to # insert the slot in the correct position based on priority. 
# High priority slots are placed at the front of the list # low/negative priority slots are at the end of the list for i, s in enumerate(self._slots): if s.priority < slot.priority: self._slots.insert(i, slot) return self._slots.append(slot) def _remove_slot(self, slot: Literal["all"] | int | WeakCallback) -> None: """Remove a slot from the list of slots.""" # implementing this as a method allows us to override/extend it in subclasses if slot == "all": self._slots.clear() elif isinstance(slot, int): self._slots.pop(slot) else: self._slots.remove(cast("WeakCallback", slot)) def _try_discard(self, callback: WeakCallback, missing_ok: bool = True) -> None: """Try to discard a callback from the list of slots. Parameters ---------- callback : WeakCallback A callback to discard. missing_ok : bool, optional If `True`, do not raise an error if the callback is not found in the list. """ try: self._remove_slot(callback) except ValueError: if not missing_ok: raise def connect_setattr( self, obj: object, attr: str, maxargs: int | None | object = _NULL, *, on_ref_error: RefErrorChoice = "warn", priority: int = 0, ) -> WeakCallback[None]: """Bind an object attribute to the emitted value of this signal. Equivalent to calling `self.connect(functools.partial(setattr, obj, attr))`, but with additional weakref safety (i.e. a strong reference to `obj` will not be retained). The return object can be used to [`disconnect()`][psygnal.SignalInstance.disconnect], (or you can use [`disconnect_setattr()`][psygnal.SignalInstance.disconnect_setattr]). Parameters ---------- obj : object An object. attr : str The name of an attribute on `obj` that should be set to the value of this signal when emitted. maxargs : Optional[int] max number of positional args to accept on_ref_error: {'raise', 'warn', 'ignore'}, optional What to do if a weak reference cannot be created. If 'raise', a ReferenceError will be raised. If 'warn' (default), a warning will be issued and a strong-reference will be used. If 'ignore' a strong-reference will be used (silently). priority : int The priority of the callback. This is used to determine the order in which callbacks are called when multiple are connected to the same signal. Higher priority callbacks are called first. Negative values are allowed. The default is 0. Returns ------- Tuple (weakref.ref, name, callable). Reference to the object, name of the attribute, and setattr closure. Can be used to disconnect the slot. Raises ------ ValueError If this is not a single-value signal AttributeError If `obj` has no attribute `attr`. Examples -------- >>> class T: ... sig = Signal(int) >>> class SomeObj: ... x = 1 >>> t = T() >>> my_obj = SomeObj() >>> t.sig.connect_setattr(my_obj, "x") >>> t.sig.emit(5) >>> assert my_obj.x == 5 """ if maxargs is _NULL: warnings.warn( "The default value of maxargs will change from `None` to `1` in " "version 0.11. To silence this warning, provide an explicit value for " "maxargs (`None` for current behavior, `1` for future behavior).", FutureWarning, stacklevel=2, ) maxargs = None if not hasattr(obj, attr): raise AttributeError(f"Object {obj} has no attribute {attr!r}") with self._lock: caller = WeakSetattr( obj, attr, max_args=cast("int | None", maxargs), finalize=self._try_discard, on_ref_error=on_ref_error, priority=priority, ) self._append_slot(caller) return caller def disconnect_setattr( self, obj: object, attr: str, missing_ok: bool = True ) -> None: """Disconnect a previously connected attribute setter. Parameters ---------- obj : object An object. 
attr : str The name of an attribute on `obj` that was previously used for `connect_setattr`. missing_ok : bool If `False` and the provided `slot` is not connected, raises `ValueError`. by default `True` Raises ------ ValueError If `missing_ok` is `False` and no attribute setter is connected. """ with self._lock: cb = WeakSetattr(obj, attr, on_ref_error="ignore") self._try_discard(cb, missing_ok) def connect_setitem( self, obj: object, key: str, maxargs: int | None | object = _NULL, *, on_ref_error: RefErrorChoice = "warn", priority: int = 0, ) -> WeakCallback[None]: """Bind a container item (such as a dict key) to the emitted value of this signal. Equivalent to calling `self.connect(functools.partial(obj.__setitem__, key))`, but with additional weakref safety (i.e. a strong reference to `obj` will not be retained). The return object can be used to [`disconnect()`][psygnal.SignalInstance.disconnect], (or you can use [`disconnect_setitem()`][psygnal.SignalInstance.disconnect_setitem]). Parameters ---------- obj : object An object. key : str Name of the key in `obj` that should be set to the value of this signal when emitted. maxargs : Optional[int] max number of positional args to accept on_ref_error: {'raise', 'warn', 'ignore'}, optional What to do if a weak reference cannot be created. If 'raise', a ReferenceError will be raised. If 'warn' (default), a warning will be issued and a strong-reference will be used. If 'ignore' a strong-reference will be used (silently). priority : int The priority of the callback. This is used to determine the order in which callbacks are called when multiple are connected to the same signal. Higher priority callbacks are called first. Negative values are allowed. The default is 0. Returns ------- Tuple (weakref.ref, name, callable). Reference to the object, name of the key, and setitem closure. Can be used to disconnect the slot. Raises ------ ValueError If this is not a single-value signal TypeError If `obj` does not support __setitem__. Examples -------- >>> class T: ... sig = Signal(int) >>> t = T() >>> my_obj = dict() >>> t.sig.connect_setitem(my_obj, "x") >>> t.sig.emit(5) >>> assert my_obj == {"x": 5} """ if maxargs is _NULL: warnings.warn( "The default value of maxargs will change from `None` to `1` in " "version 0.11. To silence this warning, provide an explicit value for " "maxargs (`None` for current behavior, `1` for future behavior).", FutureWarning, stacklevel=2, ) maxargs = None if not hasattr(obj, "__setitem__"): raise TypeError(f"Object {obj} does not support __setitem__") with self._lock: caller = WeakSetitem( obj, key, max_args=cast("int | None", maxargs), finalize=self._try_discard, on_ref_error=on_ref_error, priority=priority, ) self._append_slot(caller) return caller def disconnect_setitem( self, obj: object, key: str, missing_ok: bool = True ) -> None: """Disconnect a previously connected item setter. Parameters ---------- obj : object An object. key : str The name of a key in `obj` that was previously used for `connect_setitem`. missing_ok : bool If `False` and the provided `slot` is not connected, raises `ValueError`. by default `True` Raises ------ ValueError If `missing_ok` is `False` and no item setter is connected.
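# A sketch of the `maxargs` behavior for the setattr/setitem helpers above: with
# a multi-value signal, `maxargs=1` stores only the first emitted argument, while
# `maxargs=None` stores the full args tuple.  The signal and keys are illustrative.
from psygnal import Signal


class Emitter:
    moved = Signal(int, int)  # e.g. (x, y)


e = Emitter()
store: dict = {}

# plain dicts cannot be weak-referenced, so use on_ref_error="ignore" to
# silently keep a strong reference instead of warning.
e.moved.connect_setitem(store, "xy", maxargs=None, on_ref_error="ignore")
e.moved.connect_setitem(store, "x", maxargs=1, on_ref_error="ignore")

e.moved.emit(3, 4)
assert store == {"xy": (3, 4), "x": 3}

# disconnecting uses the same (obj, key) pair:
e.moved.disconnect_setitem(store, "x")
e.moved.emit(5, 6)
assert store == {"xy": (5, 6), "x": 3}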
""" if not hasattr(obj, "__setitem__"): raise TypeError(f"Object {obj} does not support __setitem__") with self._lock: caller = WeakSetitem(obj, key, on_ref_error="ignore") self._try_discard(caller, missing_ok) def _check_nargs( self, slot: Callable, spec: Signature ) -> tuple[Signature | None, int | None, bool]: """Make sure slot is compatible with signature. Also returns the maximum number of arguments that we can pass to the slot Returns ------- slot_sig : Signature | None The signature of the slot, or None if it could not be determined. maxargs : int | None The maximum number of arguments that we can pass to the slot. is_qt : bool Whether the slot is a Qt slot. """ try: slot_sig = _get_signature_possibly_qt(slot) except ValueError as e: warnings.warn( f"{e}. To silence this warning, connect with `check_nargs=False`", stacklevel=4, ) return None, None, False try: minargs, maxargs = _acceptable_posarg_range(slot_sig) except ValueError as e: if isinstance(slot, partial): raise ValueError( f"{e}. (Note: prefer using positional args with " "functools.partials when possible)." ) from e raise # if `slot` requires more arguments than we will provide, raise. if minargs > (n_spec_params := len(spec.parameters)): extra = ( f"- Slot requires at least {minargs} positional " f"arguments, but spec only provides {n_spec_params}" ) self._raise_connection_error(slot, extra) return None if isinstance(slot_sig, str) else slot_sig, maxargs, True def _raise_connection_error(self, slot: Callable, extra: str = "") -> NoReturn: name = getattr(slot, "__name__", str(slot)) msg = f"Cannot connect slot {name!r} with signature: {signature(slot)}:\n" msg += extra msg += f"\n\nAccepted signature: {self.signature}" raise ValueError(msg) def _slot_index(self, slot: Callable) -> int: """Get index of `slot` in `self._slots`. Return -1 if not connected.""" with self._lock: normed = weak_callback(slot, on_ref_error="ignore") # NOTE: # the == method here relies on the __eq__ method of each SlotCaller subclass return next((i for i, s in enumerate(self._slots) if s == normed), -1) def disconnect(self, slot: Callable | None = None, missing_ok: bool = True) -> None: """Disconnect slot from signal. Parameters ---------- slot : callable, optional The specific slot to disconnect. If `None`, all slots will be disconnected, by default `None` missing_ok : Optional[bool] If `False` and the provided `slot` is not connected, raises `ValueError. by default `True` Raises ------ ValueError If `slot` is not connected and `missing_ok` is False. """ with self._lock: if slot is None: # NOTE: clearing an empty list is actually a RuntimeError in Qt self._remove_slot("all") return idx = self._slot_index(slot) if idx != -1: self._remove_slot(idx) elif not missing_ok: raise ValueError(f"slot is not connected: {slot}") def __contains__(self, slot: Callable) -> bool: """Return `True` if slot is connected.""" # Check if slot is callable first # this change is needed for some reason after mypy v1.14.0 if callable(slot): return self._slot_index(slot) >= 0 return False # pragma: no cover def __len__(self) -> int: """Return number of connected slots.""" return len(self._slots) def emit( self, *args: Any, check_nargs: bool = False, check_types: bool = False ) -> None: """Emit this signal with arguments `args`. !!! note `check_args` and `check_types` both add overhead when calling emit. 
Parameters ---------- *args : Any These arguments will be passed when calling each slot (unless the slot accepts fewer arguments, in which case extra args will be discarded.) check_nargs : bool If `True` and the provided arguments cannot be successfully bound to the signature of this Signal, raise `TypeError`. Incurs some overhead. by default False. check_types : bool If `True` and the provided arguments do not match the types declared by the signature of this Signal, raise `TypeError`. Incurs some overhead. by default False. Raises ------ TypeError If `check_nargs` and/or `check_types` are `True`, and the corresponding checks fail. """ if self._is_blocked: return if check_nargs: try: self.signature.bind(*args) except TypeError as e: raise TypeError( f"Cannot emit args {args} from signal {self!r} with " f"signature {self.signature}:\n{e}" ) from e if check_types and not _parameter_types_match( lambda: None, self.signature, _build_signature(*[type(a) for a in args]) ): raise TypeError( f"Types provided to '{self.name}.emit' " f"{tuple(type(a).__name__ for a in args)} do not match signal " f"signature: {self.signature}" ) if self._is_paused: self._args_queue.append(args) return if SignalInstance._debug_hook is not None: from ._group import EmissionInfo SignalInstance._debug_hook(EmissionInfo(self, args)) self._run_emit_loop(args) def emit_fast(self, *args: Any) -> None: """Fast emit without any checks. This method can be up to 10x faster than `emit()`, but it lacks most of the features and safety checks of `emit()`. Use with caution. Specifically: - It does not support `check_nargs` or `check_types`. - It does not use any thread safety locks. - It is not possible to query the emitter with `Signal.current_emitter()` - It is not possible to query the sender with `Signal.sender()` - It does not support "queued" or "latest-only" reemission modes for nested emits. It will always use "immediate" mode, wherein signals emitted by callbacks are immediately processed in a deeper emission loop. It DOES, however, support `paused()` and `blocked()`. Parameters ---------- *args : Any These arguments will be passed when calling each slot (unless the slot accepts fewer arguments, in which case extra args will be discarded.) """ if self._is_blocked: return if self._is_paused: self._args_queue.append(args) return if self._recursion_depth >= RECURSION_LIMIT: raise RecursionError( f"Psygnal recursion limit ({RECURSION_LIMIT}) reached when emitting " f"signal {self.name!r} with args {args}" ) self._recursion_depth += 1 try: for caller in self._slots: caller.cb(args) except RecursionError as e: raise RecursionError( f"RecursionError when emitting signal {self.name!r} with args {args}" ) from e except EmitLoopError as e: # pragma: no cover raise e except Exception as cb_err: loop_err = EmitLoopError(exc=cb_err, signal=self).with_traceback( cb_err.__traceback__ ) # this comment will show up in the traceback raise loop_err from cb_err # emit() call ABOVE || callback error BELOW finally: if self._recursion_depth > 0: self._recursion_depth -= 1 def __call__( self, *args: Any, check_nargs: bool = False, check_types: bool = False ) -> None: """Alias for `emit()`.
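# A short sketch contrasting `emit()` and `emit_fast()` as described above: the
# optional argument/type validation is only performed by `emit()`, and only when
# explicitly requested.  The signal and values are illustrative.
from psygnal import Signal


class Emitter:
    value = Signal(int)


e = Emitter()
e.value.connect(print)

e.value.emit(1)                    # default: no validation overhead
e.value.emit(2, check_nargs=True)  # verify the number of arguments
try:
    e.value.emit("x", check_types=True)  # str does not match the declared int
except TypeError as err:
    print("rejected:", err)

e.value.emit_fast(3)               # minimal overhead: no checks, no locks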
But prefer using `emit()` for clarity.""" return self.emit(*args, check_nargs=check_nargs, check_types=check_types) def _run_emit_loop(self, args: tuple[Any, ...]) -> None: if self._recursion_depth >= RECURSION_LIMIT: raise RecursionError("Recursion limit reached!") with self._lock: self._emit_queue.append(args) if len(self._emit_queue) > 1: return try: # allow receiver to query sender with Signal.current_emitter() self._recursion_depth += 1 self._max_recursion_depth = max( self._max_recursion_depth, self._recursion_depth ) with Signal._emitting(self): self._run_emit_loop_inner() except RecursionError as e: raise RecursionError( f"RecursionError when " f"emitting signal {self.name!r} with args {args}" ) from e except EmitLoopError as e: raise e except Exception as cb_err: loop_err = EmitLoopError( exc=cb_err, signal=self, recursion_depth=self._recursion_depth - 1, reemission=self._reemission, emit_queue=self._emit_queue, ).with_traceback(cb_err.__traceback__) # this comment will show up in the traceback raise loop_err from cb_err # emit() call ABOVE || callback error BELOW finally: self._recursion_depth -= 1 # we're back to the root level of the emit loop, reset max_depth if self._recursion_depth <= 0: self._max_recursion_depth = 0 self._recursion_depth = 0 self._emit_queue.clear() def _run_emit_loop_immediate(self) -> None: args = self._emit_queue.popleft() for caller in self._slots: caller.cb(args) def _run_emit_loop_latest_only(self) -> None: self._args = args = self._emit_queue.popleft() for caller in self._slots: if self._recursion_depth < self._max_recursion_depth: # we've already entered a deeper emit loop # we should drop the remaining slots in this round and return break self._caller = caller caller.cb(args) def _run_emit_loop_queued(self) -> None: i = 0 while i < len(self._emit_queue): args = self._emit_queue[i] for caller in self._slots: caller.cb(args) if len(self._emit_queue) > RECURSION_LIMIT: raise RecursionError i += 1 def block(self, exclude: Container[str | SignalInstance] = ()) -> None: """Block this signal from emitting. NOTE: the `exclude` argument is only for SignalGroup subclass, but we have to include it here to make mypyc happy. """ self._is_blocked = True def unblock(self) -> None: """Unblock this signal, allowing it to emit.""" self._is_blocked = False def blocked(self) -> AbstractContextManager[None]: """Context manager to temporarily block this signal. Useful if you need to temporarily block all emission of a given signal, (for example, to avoid a recursive signal loop) Examples -------- ```python class MyEmitter: changed = Signal() def make_a_change(self): self.changed.emit() obj = MyEmitter() with obj.changed.blocked() obj.make_a_change() # will NOT emit a changed signal. ``` """ return _SignalBlocker(self) def pause(self) -> None: """Pause all emission and collect *args tuples from emit(). args passed to `emit` will be collected and re-emitted when `resume()` is called. For a context manager version, see `paused()`. """ self._is_paused = True def resume(self, reducer: ReducerFunc | None = None, initial: Any = _NULL) -> None: """Resume (unpause) this signal, emitting everything in the queue. Parameters ---------- reducer : Callable | None A optional function to reduce the args collected while paused into a single emitted group of args. If not provided, all emissions will be re-emitted as they were collected when the signal is resumed. May be: - a function that takes two args tuples and returns a single args tuple. 
This will be passed to `functools.reduce` and is expected to reduce all collected/emitted args into a single tuple. For example, three `emit(1)` events would be reduced and re-emitted as follows: `self.emit(*functools.reduce(reducer, [(1,), (1,), (1,)]))` - a function that takes a single argument (an iterable of args tuples) and returns a tuple (the reduced args). This will be *not* be passed to `functools.reduce`. If `reducer` is a function that takes a single argument, `initial` will be ignored. initial: any, optional initial value to pass to `functools.reduce` Examples -------- >>> class T: ... sig = Signal(int) >>> t = T() >>> t.sig.pause() >>> t.sig.emit(1) >>> t.sig.emit(2) >>> t.sig.emit(3) >>> t.sig.resume(lambda a, b: (a[0].union(set(b)),), (set(),)) >>> # results in t.sig.emit({1, 2, 3}) """ self._is_paused = False # not sure why this attribute wouldn't be set, but when resuming in # EventedModel.update, it may be undefined (as seen in tests) if not getattr(self, "_args_queue", None): return if len(self._slots) == 0: self._args_queue.clear() return if reducer is not None: if len(inspect.signature(reducer).parameters) == 1: args = cast("ReducerOneArg", reducer)(self._args_queue) else: reducer = cast("ReducerTwoArgs", reducer) if initial is _NULL: args = reduce(reducer, self._args_queue) else: args = reduce(reducer, self._args_queue, initial) self._run_emit_loop(args) else: for args in self._args_queue: self._run_emit_loop(args) self._args_queue.clear() def paused( self, reducer: ReducerFunc | None = None, initial: Any = _NULL ) -> AbstractContextManager[None]: """Context manager to temporarily pause this signal. Parameters ---------- reducer : Callable | None A optional function to reduce the args collected while paused into a single emitted group of args. If not provided, all emissions will be re-emitted as they were collected when the signal is resumed. May be: - a function that takes two args tuples and returns a single args tuple. This will be passed to `functools.reduce` and is expected to reduce all collected/emitted args into a single tuple. For example, three `emit(1)` events would be reduced and re-emitted as follows: `self.emit(*functools.reduce(reducer, [(1,), (1,), (1,)]))` - a function that takes a single argument (an iterable of args tuples) and returns a tuple (the reduced args). This will be *not* be passed to `functools.reduce`. If `reducer` is a function that takes a single argument, `initial` will be ignored. initial: any, optional initial value to pass to `functools.reduce` Examples -------- >>> with obj.signal.paused(lambda a, b: (a[0].union(set(b)),), (set(),)): ... t.sig.emit(1) ... t.sig.emit(2) ... 
t.sig.emit(3) >>> # results in obj.signal.emit({1, 2, 3}) """ return _SignalPauser(self, reducer, initial) def __getstate__(self) -> dict: """Return dict of current state, for pickle.""" attrs = ( "_signature", "_name", "_is_blocked", "_is_paused", "_args_queue", "_check_nargs_on_connect", "_check_types_on_connect", "_emit_queue", "_priority_in_use", "_reemission", "_max_recursion_depth", "_recursion_depth", ) dd = {slot: getattr(self, slot) for slot in attrs} dd["_instance"] = self._instance() dd["_slots"] = [x for x in self._slots if isinstance(x, StrongFunction)] if len(self._slots) > len(dd["_slots"]): warnings.warn( "Pickling a SignalInstance does not copy connected weakly referenced " "slots.", stacklevel=2, ) return dd def __setstate__(self, state: dict) -> None: """Restore state from pickle.""" # don't use __dict__, mypyc doesn't have it for k, v in state.items(): if k == "_instance": self._instance = self._instance_ref(v) else: setattr(self, k, v) self._lock = threading.RLock() if self._reemission == ReemissionMode.QUEUED: # pragma: no cover self._run_emit_loop_inner = self._run_emit_loop_queued elif self._reemission == ReemissionMode.LATEST: # pragma: no cover self._run_emit_loop_inner = self._run_emit_loop_latest_only else: self._run_emit_loop_inner = self._run_emit_loop_immediate def _psygnal_relocate_info_(self, emission_info: EmissionInfo) -> EmissionInfo: """Hook to modify emission info before it is emitted. This hook is invoked by _group.SignalRelay._slot_relay and it allows a specific signal to modify the emission info before it is passed to the slot. Most often, callers will want to use `emission_info.insert_path` to change the path. This is only relevant for emission events that are relayed through a SignalRelay, such as when a signal is being re-emitted as a group signal. By default, this method returns the emission info unchanged. """ return emission_info class _SignalBlocker: """Context manager to block and unblock a signal.""" def __init__( self, signal: SignalInstance, exclude: Container[str | SignalInstance] = () ) -> None: self._signal = signal self._exclude = exclude self._was_blocked = signal._is_blocked def __enter__(self) -> None: self._signal.block(exclude=self._exclude) def __exit__(self, *args: Any) -> None: if not self._was_blocked: self._signal.unblock() class _SignalPauser: """Context manager to pause and resume a signal.""" def __init__( self, signal: SignalInstance, reducer: ReducerFunc | None, initial: Any ) -> None: self._was_paused = signal._is_paused self._signal = signal self._reducer = reducer self._initial = initial def __enter__(self) -> None: self._signal.pause() def __exit__(self, *args: Any) -> None: if not self._was_paused: self._signal.resume(self._reducer, self._initial) # ############################################################################# # ############################################################################# def signature(obj: Any) -> inspect.Signature: try: return inspect.signature(obj) except ValueError as e: with suppress(Exception): if not inspect.ismethod(obj): return _stub_sig(obj) raise e from e _ANYSIG = Signature( [ Parameter(name="args", kind=Parameter.VAR_POSITIONAL), Parameter(name="kwargs", kind=Parameter.VAR_KEYWORD), ] ) @cache def _stub_sig(obj: Any) -> Signature: """Called as a backup when inspect.signature fails.""" import builtins # this nonsense is here because it's hard to get the signature of mypyc-compiled # objects, but we still want to be able to connect a signal instance. 
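# A sketch of the pause/resume reducer behavior documented above: while paused,
# emitted args tuples are queued, and a two-argument reducer is folded over the
# queue with `functools.reduce` before a single re-emission.  Names are
# illustrative only.
from psygnal import Signal


class Counter:
    ticked = Signal(int)


received: list[int] = []


def on_tick(total: int) -> None:
    received.append(total)


c = Counter()
c.ticked.connect(on_tick)

with c.ticked.paused(reducer=lambda a, b: (a[0] + b[0],)):
    c.ticked.emit(1)
    c.ticked.emit(2)
    c.ticked.emit(3)

# the three queued emissions were reduced to one: (1,) + (2,) + (3,) -> (6,)
assert received == [6]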
if ( type(getattr(obj, "__self__", None)) is SignalInstance and getattr(obj, "__name__", None) == "emit" ) or type(obj) is SignalInstance: # we won't reach this in testing because # Compiled functions don't trigger profiling and tracing hooks return _ANYSIG # pragma: no cover # just a common case if obj is builtins.print: params = [ Parameter(name="value", kind=Parameter.VAR_POSITIONAL), Parameter(name="sep", kind=Parameter.KEYWORD_ONLY, default=" "), Parameter(name="end", kind=Parameter.KEYWORD_ONLY, default="\n"), Parameter(name="file", kind=Parameter.KEYWORD_ONLY, default=None), Parameter(name="flush", kind=Parameter.KEYWORD_ONLY, default=False), ] return Signature(params) raise ValueError("unknown object") def _build_signature(*types: type[Any]) -> Signature: params = [ Parameter(name=f"p{i}", kind=Parameter.POSITIONAL_ONLY, annotation=t) for i, t in enumerate(types) ] return Signature(params) # def f(a, /, b, c=None, *d, f=None, **g): print(locals()) # # a: kind=POSITIONAL_ONLY, default=Parameter.empty # 1 required posarg # b: kind=POSITIONAL_OR_KEYWORD, default=Parameter.empty # 1 requires posarg # c: kind=POSITIONAL_OR_KEYWORD, default=None # 1 optional posarg # d: kind=VAR_POSITIONAL, default=Parameter.empty # N optional posargs # e: kind=KEYWORD_ONLY, default=Parameter.empty # 1 REQUIRED kwarg # f: kind=KEYWORD_ONLY, default=None # 1 optional kwarg # g: kind=VAR_KEYWORD, default=Parameter.empty # N optional kwargs def _get_signature_possibly_qt(slot: Callable) -> Signature | str: # checking qt has to come first, since the signature of the emit method # of a Qt SignalInstance is just None> # https://bugreports.qt.io/browse/PYSIDE-1713 sig = _guess_qtsignal_signature(slot) return signature(slot) if sig is None else sig def _acceptable_posarg_range( sig: Signature | str, forbid_required_kwarg: bool = True ) -> tuple[int, int | None]: """Return tuple of (min, max) accepted positional arguments. Parameters ---------- sig : Signature Signature object to evaluate forbid_required_kwarg : Optional[bool] Whether to allow required KEYWORD_ONLY parameters. by default True. Returns ------- arg_range : Tuple[int, int] minimum, maximum number of acceptable positional arguments Raises ------ ValueError If the signature has a required keyword_only parameter and `forbid_required_kwarg` is `True`. """ if isinstance(sig, str): if "(" not in sig: # pragma: no cover raise ValueError(f"Unrecognized string signature format: {sig!r}") inner = sig.split("(", 1)[1].split(")", 1)[0] minargs = maxargs = inner.count(",") + 1 if inner else 0 return minargs, maxargs required = 0 optional = 0 posargs_unlimited = False _pos_required = {Parameter.POSITIONAL_ONLY, Parameter.POSITIONAL_OR_KEYWORD} for param in sig.parameters.values(): if param.kind in _pos_required: if param.default is Parameter.empty: required += 1 else: optional += 1 elif param.kind is Parameter.VAR_POSITIONAL: posargs_unlimited = True elif ( param.kind is Parameter.KEYWORD_ONLY and param.default is Parameter.empty and forbid_required_kwarg ): raise ValueError(f"Unsupported KEYWORD_ONLY parameters in signature: {sig}") return (required, None if posargs_unlimited else required + optional) def _parameter_types_match( function: Callable, spec: Signature, func_sig: Signature | None = None ) -> bool: """Return True if types in `function` signature match those in `spec`. Parameters ---------- function : Callable A function to validate spec : Signature The Signature against which the `function` should be validated. 
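# A sketch of the connect-time validation implemented by `_check_nargs`,
# `_acceptable_posarg_range`, and `_parameter_types_match` above, seen through
# the public API.  The signal and slot names are illustrative.
from psygnal import Signal


class Emitter:
    changed = Signal(int)  # provides a single positional int


e = Emitter()

e.changed.connect(lambda: None)    # accepts fewer args: OK, extras are dropped
e.changed.connect(lambda v: None)  # exact match: OK

try:
    e.changed.connect(lambda a, b: None)  # *requires* 2 posargs, only 1 provided
except ValueError as err:
    print("rejected:", err)


def takes_str(value: str) -> None: ...


try:
    e.changed.connect(takes_str, check_types=True)  # str is not compatible with int
except ValueError as err:
    print("rejected:", err)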
func_sig : Signature, optional Signature for `function`, if `None`, signature will be inspected. by default None Returns ------- bool True if the parameter types match. """ fsig = func_sig or signature(function) func_hints: dict | None = None for f_param, spec_param in zip( fsig.parameters.values(), spec.parameters.values(), strict=False ): f_anno = f_param.annotation if f_anno is fsig.empty: # if function parameter is not type annotated, allow it. continue if isinstance(f_anno, str): if func_hints is None: func_hints = get_type_hints(function) f_anno = func_hints.get(f_param.name) if not _is_subclass(f_anno, spec_param.annotation): return False return True def _is_subclass(left: type[Any], right: type) -> bool: """Variant of issubclass with support for unions.""" if not isclass(left) and get_origin(left) in {Union, UnionType}: return any(issubclass(i, right) for i in get_args(left)) return issubclass(left, right) def _guess_qtsignal_signature(obj: Any) -> str | None: """Return string signature if `obj` is a SignalInstance or Qt emit method. This is a bit of a hack, but we found no better way: https://stackoverflow.com/q/69976089/1631624 https://bugreports.qt.io/browse/PYSIDE-1713 """ # on my machine, this takes ~700ns on PyQt5 and 8.7µs on PySide2 type_ = type(obj) if "pyqtBoundSignal" in type_.__name__: return cast("str", obj.signal) qualname = getattr(obj, "__qualname__", "") if qualname == "pyqtBoundSignal.emit": return cast("str", obj.__self__.signal) # note: this IS all actually covered in tests... but only in the Qt tests, # so it (annoyingly) briefly looks like it fails coverage. if qualname == "SignalInstance.emit" and type_.__name__.startswith("builtin"): # we likely have the emit method of a SignalInstance # call it with ridiculous params to get the err return _ridiculously_call_emit(obj.__self__.emit) # pragma: no cover if "SignalInstance" in type_.__name__ and "QtCore" in getattr( type_, "__module__", "" ): # pragma: no cover return _ridiculously_call_emit(obj.emit) return None _CRAZY_ARGS = (1,) * 255 # note: this IS all actually covered in tests... but only in the Qt tests, # so it (annoyingly) briefly looks like it fails coverage. 
def _ridiculously_call_emit(emitter: Any) -> str | None: # pragma: no cover """Call SignalInstance emit() to get the signature from err message.""" try: emitter(*_CRAZY_ARGS) except TypeError as e: if "only accepts" in str(e): return str(e).split("only accepts")[0].strip() return None # pragma: no cover _compiled: bool def __getattr__(name: str) -> Any: if name == "_compiled": return hasattr(Signal, "__mypyc_attrs__") raise AttributeError(f"module {__name__!r} has no attribute {name!r}") psygnal-0.15.0/src/psygnal/_throttler.py0000644000000000000000000002023315073705675015223 0ustar00from __future__ import annotations from threading import Timer from typing import TYPE_CHECKING, Any, Generic, Literal, TypeVar if TYPE_CHECKING: import inspect from collections.abc import Callable from typing import ParamSpec Kind = Literal["throttler", "debouncer"] EmissionPolicy = Literal["trailing", "leading"] P = ParamSpec("P") else: # just so that we don't have to depend on a new version of typing_extensions # at runtime P = TypeVar("P") class _ThrottlerBase(Generic[P]): _timer: Timer def __init__( self, func: Callable[P, Any], interval: int = 100, policy: EmissionPolicy = "leading", ) -> None: self.__wrapped__: Callable[P, Any] = func self._interval: float = interval / 1000 self._policy: EmissionPolicy = policy self._has_pending: bool = False self._timer: Timer = Timer(0, lambda: None) self._timer.start() self._args: tuple[Any, ...] = () self._kwargs: dict[str, Any] = {} # this mimics what functools.wraps does, but avoids __dict__ usage and other # things that won't work with mypyc... HOWEVER, most of these dynamic # assignments won't work in mypyc anyway (they just do nothing.) self.__module__: str = getattr(func, "__module__", "") self.__name__: str = getattr(func, "__name__", "") self.__qualname__: str = getattr(func, "__qualname__", "") self.__doc__: str | None = getattr(func, "__doc__", None) self.__annotations__: dict[str, Any] = getattr(func, "__annotations__", {}) def _actually_call(self) -> None: self._has_pending = False self.__wrapped__(*self._args, **self._kwargs) self._start_timer() def _call_if_has_pending(self) -> None: if self._has_pending: self._actually_call() def _start_timer(self) -> None: self._timer.cancel() self._timer = Timer(self._interval, self._call_if_has_pending) self._timer.start() def cancel(self) -> None: """Cancel any pending calls.""" self._has_pending = False self._timer.cancel() def flush(self) -> None: """Force a call if there is one pending.""" self._call_if_has_pending() def __call__(self, *args: P.args, **kwargs: P.kwargs) -> None: raise NotImplementedError("Subclasses must implement this method.") @property def __signature__(self) -> inspect.Signature: import inspect return inspect.signature(self.__wrapped__) class Throttler(_ThrottlerBase, Generic[P]): """Class that prevents calling `func` more than once per `interval`. 
Parameters ---------- func : Callable[P, Any] a function to wrap interval : int, optional the minimum interval in ms that must pass before the function is called again, by default 100 policy : EmissionPolicy, optional Whether to invoke the function on the "leading" or "trailing" edge of the wait timer, by default "leading" """ _timer: Timer def __init__( self, func: Callable[P, Any], interval: int = 100, policy: EmissionPolicy = "leading", ) -> None: super().__init__(func, interval, policy) def __call__(self, *args: P.args, **kwargs: P.kwargs) -> None: """Call underlying function.""" self._has_pending = True self._args = args self._kwargs = kwargs if not self._timer.is_alive(): if self._policy == "leading": self._actually_call() else: self._start_timer() class Debouncer(_ThrottlerBase, Generic[P]): """Class that waits at least `interval` before calling `func`. Parameters ---------- func : Callable[P, Any] a function to wrap interval : int, optional the minimum interval in ms that must pass before the function is called again, by default 100 policy : EmissionPolicy, optional Whether to invoke the function on the "leading" or "trailing" edge of the wait timer, by default "trailing" """ _timer: Timer def __init__( self, func: Callable[P, Any], interval: int = 100, policy: EmissionPolicy = "trailing", ) -> None: super().__init__(func, interval, policy) def __call__(self, *args: P.args, **kwargs: P.kwargs) -> None: """Call underlying function.""" self._has_pending = True self._args = args self._kwargs = kwargs if not self._timer.is_alive() and self._policy == "leading": self._actually_call() self._start_timer() def throttled( func: Callable[P, Any] | None = None, timeout: int = 100, leading: bool = True, ) -> Throttler[P] | Callable[[Callable[P, Any]], Throttler[P]]: """Create a throttled function that invokes func at most once per timeout. The throttled function comes with a `cancel` method to cancel delayed func invocations and a `flush` method to immediately invoke them. Options to indicate whether func should be invoked on the leading and/or trailing edge of the wait timeout. The func is invoked with the last arguments provided to the throttled function. Subsequent calls to the throttled function return the result of the last func invocation. This decorator may be used with or without parameters. Parameters ---------- func : Callable A function to throttle timeout : int Timeout in milliseconds to wait before allowing another call, by default 100 leading : bool Whether to invoke the function on the leading edge of the wait timer, by default True Examples -------- ```python from psygnal import Signal, throttled class MyEmitter: changed = Signal(int) def on_change(val: int) # do something possibly expensive ... emitter = MyEmitter() # connect the `on_change` whenever `emitter.changed` is emitted # BUT, no more than once every 50 milliseconds emitter.changed.connect(throttled(on_change, timeout=50)) ``` """ def deco(func: Callable[P, Any]) -> Throttler[P]: policy: EmissionPolicy = "leading" if leading else "trailing" return Throttler(func, timeout, policy) return deco(func) if func is not None else deco def debounced( func: Callable[P, Any] | None = None, timeout: int = 100, leading: bool = False, ) -> Debouncer[P] | Callable[[Callable[P, Any]], Debouncer[P]]: """Create a debounced function that delays invoking `func`. `func` will not be invoked until `timeout` ms have elapsed since the last time the debounced function was invoked. 
The debounced function comes with a `cancel` method to cancel delayed func invocations and a `flush` method to immediately invoke them. Options indicate whether func should be invoked on the leading and/or trailing edge of the wait timeout. The func is invoked with the *last* arguments provided to the debounced function. Subsequent calls to the debounced function return the result of the last `func` invocation. This decorator may be used with or without parameters. Parameters ---------- func : Callable A function to throttle timeout : int Timeout in milliseconds to wait before allowing another call, by default 100 leading : bool Whether to invoke the function on the leading edge of the wait timer, by default False Examples -------- ```python from psygnal import Signal, debounced class MyEmitter: changed = Signal(int) def on_change(val: int) # do something possibly expensive ... emitter = MyEmitter() # connect the `on_change` whenever `emitter.changed` is emitted # ONLY once at least 50 milliseconds have passed since the last signal emission. emitter.changed.connect(debounced(on_change, timeout=50)) ``` """ def deco(func: Callable[P, Any]) -> Debouncer[P]: policy: EmissionPolicy = "leading" if leading else "trailing" return Debouncer(func, timeout, policy) return deco(func) if func is not None else deco psygnal-0.15.0/src/psygnal/_throttler.pyi0000644000000000000000000000436715073705675015406 0ustar00# this pyi file exists until we can use ParamSpec with mypyc in the main file. from collections.abc import Callable from typing import Any, Generic, Literal, ParamSpec, overload P = ParamSpec("P") Kind = Literal["throttler", "debouncer"] EmissionPolicy = Literal["trailing", "leading"] class _ThrottlerBase(Generic[P]): def __init__( self, func: Callable[P, Any], interval: int = 100, policy: EmissionPolicy = "leading", ) -> None: ... def cancel(self) -> None: """Cancel any pending calls.""" def flush(self) -> None: """Force a call if there is one pending.""" def __call__(self, *args: P.args, **kwargs: P.kwargs) -> None: ... class Throttler(_ThrottlerBase[P]): """Class that prevents calling `func` more than once per `interval`. Parameters ---------- func : Callable[P, Any] a function to wrap interval : int, optional the minimum interval in ms that must pass before the function is called again, by default 100 policy : EmissionPolicy, optional Whether to invoke the function on the "leading" or "trailing" edge of the wait timer, by default "leading" """ class Debouncer(_ThrottlerBase[P]): """Class that waits at least `interval` before calling `func`. Parameters ---------- func : Callable[P, Any] a function to wrap interval : int, optional the minimum interval in ms that must pass before the function is called again, by default 100 policy : EmissionPolicy, optional Whether to invoke the function on the "leading" or "trailing" edge of the wait timer, by default "trailing" """ @overload def throttled( func: Callable[P, Any], timeout: int = 100, leading: bool = True, ) -> Throttler[P]: ... @overload def throttled( func: Literal[None] | None = None, timeout: int = 100, leading: bool = True, ) -> Callable[[Callable[P, Any]], Throttler[P]]: ... @overload def debounced( func: Callable[P, Any], timeout: int = 100, leading: bool = False, ) -> Debouncer[P]: ... @overload def debounced( func: Literal[None] | None = None, timeout: int = 100, leading: bool = False, ) -> Callable[[Callable[P, Any]], Debouncer[P]]: ... 
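# A sketch of the throttle/debounce timing described above, including the
# `cancel()` and `flush()` helpers.  The 50 ms interval and function names are
# illustrative, and exact timing naturally depends on the interpreter and OS.
import time

from psygnal import debounced, throttled

calls: list[object] = []


@throttled(timeout=50)   # leading edge by default: the first call runs immediately
def on_scroll(val: int) -> None:
    calls.append(val)


on_scroll(1)             # invoked immediately
on_scroll(2)             # within the interval: recorded as pending
on_scroll(3)             # replaces the pending args
time.sleep(0.1)          # trailing pending call fires with the last args
print(calls)             # typically [1, 3]


@debounced(timeout=50)   # trailing edge by default: waits for a quiet period
def on_type(char: str) -> None:
    calls.append(char)


on_type("a")
on_type("b")
on_type.flush()          # invoke the pending call now instead of waiting
on_type.cancel()         # drop anything still pending
print(calls)             # typically [1, 3, "b"]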
psygnal-0.15.0/src/psygnal/_weak_callback.py0000644000000000000000000005733315073705675015752 0ustar00from __future__ import annotations import inspect import sys import warnings import weakref from functools import partial from types import BuiltinMethodType, FunctionType, MethodType, MethodWrapperType from typing import ( TYPE_CHECKING, Any, Generic, Literal, Protocol, TypeVar, cast, ) from warnings import warn from ._async import get_async_backend from ._mypyc import mypyc_attr if TYPE_CHECKING: from collections.abc import Callable from typing import TypeAlias, TypeGuard import toolz from ._async import _AsyncBackend RefErrorChoice: TypeAlias = Literal["raise", "warn", "ignore"] __all__ = ["WeakCallback", "weak_callback"] _T = TypeVar("_T") _R = TypeVar("_R") # return type of cb def _is_toolz_curry(obj: Any) -> TypeGuard[toolz.curry]: """Return True if obj is a toolz.curry object.""" tz = sys.modules.get("toolz") return False if tz is None else isinstance(obj, tz.curry) def weak_callback( cb: Callable[..., _R] | WeakCallback[_R], *args: Any, max_args: int | None = None, finalize: Callable[[WeakCallback], Any] | None = None, strong_func: bool = True, on_ref_error: RefErrorChoice = "warn", priority: int = 0, ) -> WeakCallback[_R]: """Create a weakly-referenced callback. This function creates a weakly-referenced callback, with special considerations for many known callable types (functions, lambdas, partials, bound methods, partials on bound methods, builtin methods, etc.). NOTE: For the sake of least-surprise, an exception is made for functions and, lambdas, which are strongly-referenced by default. See the `strong_func` parameter for more details. Parameters ---------- cb : callable The callable to be called. *args Additional positional arguments to be passed to the callback (similar to functools.partial). max_args : int, optional The maximum number of positional arguments to pass to the callback. If provided, additional arguments passed to WeakCallback.cb will be ignored. finalize : callable, optional A callable that will be called when the callback is garbage collected. The callable will be passed the WeakCallback instance as its only argument. strong_func : bool, optional If True (default), a strong reference will be kept to the function `cb` if it is a function or lambda. If False, a weak reference will be kept. The reasoning for this is that functions and lambdas are very often defined *only* to be passed to this function, and would likely be immediately garbage collected if we weakly referenced them. If you would specifically like to *allow* the function to be garbage collected, set this to False. on_ref_error : {'raise', 'warn', 'ignore'}, optional What to do if a weak reference cannot be created. If 'raise', a ReferenceError will be raised. If 'warn' (default), a warning will be issued and a strong-reference will be used. If 'ignore' a strong-reference will be used (silently). priority : int, optional The priority of the callback. This is used to determine the order in which callbacks are called when multiple are connected to the same signal. Higher priority callbacks are called first. Negative values are allowed. The default is 0. Returns ------- WeakCallback A WeakCallback subclass instance appropriate for the given callable. The fast way to "call" the callback is to use the `cb` method, passing a single args tuple, it returns nothing. A `__call__` method is also provided, that can be used to call the original function as usual. 
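# A sketch of the referencing rules described above, seen through the public
# `Signal` API: bound methods are weakly referenced (so their connection is
# discarded when the owning instance is garbage collected), while plain
# functions and lambdas are kept alive by a strong reference by default.
# The class names are illustrative.
import gc

from psygnal import Signal


class Emitter:
    changed = Signal(int)


class Listener:
    def on_change(self, value: int) -> None:
        print("listener got", value)


e = Emitter()
listener = Listener()
e.changed.connect(listener.on_change)  # weak reference to `listener`
e.changed.connect(lambda v: None)      # strong reference to the lambda

assert len(e.changed) == 2
del listener
gc.collect()                           # dead method is discarded via finalize
assert len(e.changed) == 1             # only the lambda remains connected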
Examples -------- ```python from psygnal._weak_callback import weak_callback class T: def greet(self, name): print("hello,", name) def _on_delete(weak_cb): print("deleting!") t = T() weak_cb = weak_callback(t.greet, finalize=_on_delete) weak_cb.cb(("world",)) # "hello, world" del t # "deleting!" weak_cb.cb(("world",)) # ReferenceError ``` """ if isinstance(cb, WeakCallback): return cb kwargs: dict[str, Any] | None = None if isinstance(cb, partial): args = cb.args + args kwargs = cb.keywords cb = cb.func is_coro = inspect.iscoroutinefunction(cb) if is_coro: if (backend := get_async_backend()) is None: raise RuntimeError( "Cannot create async callback yet... No async backend set. " "Please call `psygnal.set_async_backend()` before connecting." ) if not backend.running.is_set(): warnings.warn( f"\n\nConnection of async {cb.__name__!r} will not do anything!\n" "Async backend not running. Launch `get_async_backend().run()` " "in a background task and wait for `backend.running`", RuntimeWarning, stacklevel=2, ) if isinstance(cb, FunctionType): # NOTE: I know it looks like this should be easy to express in much shorter # syntax ... but mypyc will likely fail at runtime. # Make sure to test compiled version if you change this. if strong_func: if is_coro: return StrongCoroutineFunction( cb, max_args, args, kwargs, priority=priority ) return StrongFunction(cb, max_args, args, kwargs, priority=priority) else: if is_coro: return WeakCoroutineFunction( cb, max_args, args, kwargs, finalize, on_ref_error=on_ref_error, priority=priority, ) return WeakFunction( cb, max_args, args, kwargs, finalize, on_ref_error, priority=priority ) if isinstance(cb, MethodType): if getattr(cb, "__name__", None) == "__setitem__": try: key = args[0] except IndexError as e: # pragma: no cover raise TypeError( "WeakCallback.__setitem__ requires a key argument" ) from e obj = cast("SupportsSetitem", cb.__self__) return WeakSetitem( obj, key, max_args, finalize, on_ref_error, priority=priority ) if is_coro: return WeakCoroutineMethod( cb, max_args, args, kwargs, finalize, on_ref_error, priority=priority ) return WeakMethod( cb, max_args, args, kwargs, finalize, on_ref_error, priority=priority ) if isinstance(cb, (MethodWrapperType, BuiltinMethodType)): if kwargs: # pragma: no cover raise NotImplementedError( "MethodWrapperTypes do not support keyword arguments" ) if cb is setattr: try: obj, attr = args[:2] except IndexError as e: # pragma: no cover raise TypeError( "setattr requires two arguments, an object and an attribute name." ) from e return WeakSetattr( obj, attr, max_args, finalize, on_ref_error, priority=priority ) return WeakBuiltin( cb, max_args, args, finalize, on_ref_error, priority=priority ) if _is_toolz_curry(cb): cb_partial = getattr(cb, "_partial", None) if cb_partial is None: # pragma: no cover raise TypeError( "toolz.curry object found without a '_partial' attribute. This " "version of toolz is not supported. Please open an issue at psygnal." ) return weak_callback( cb_partial, *args, max_args=max_args, finalize=finalize, on_ref_error=on_ref_error, priority=priority, ) if callable(cb): # this is a bit of hack to workaround a segfault observed in testing # on python <=3.11 when compiled by mypyc, # during _weak_callback___WeakFunction_traverse # it specifically happens with MethodWrapperType objects, that I think are made # by mypyc itself. So we just don't attempt to weakref them here anymore. 
_call = getattr(cb, "__call__", None) # noqa if isinstance(_call, MethodWrapperType): return StrongFunction(_call, max_args, args, kwargs, priority=priority) return WeakFunction( cb, max_args, args, kwargs, finalize, on_ref_error, priority=priority ) raise TypeError(f"unsupported type {type(cb)}") # pragma: no cover class WeakCallback(Generic[_R]): """Abstract Base Class for weakly-referenced callbacks. Do not instantiate this class directly, use the `weak_callback` function instead. The main public-facing methods of all subclasses are: cb(args: tuple[Any, ...] = ()) -> None: special fast callback method, args only. dereference() -> Callable[..., _R] | None: return strong dereferenced callback. __call__(*args: Any, **kwargs: Any) -> _R: call original callback __eq__: compare two WeakCallback instances for equality object_key: static method that returns a unique key for an object. NOTE: can't use ABC here because then mypyc and PySide2 don't play nice together. """ def __init__( self, obj: Any, max_args: int | None = None, on_ref_error: RefErrorChoice = "warn", priority: int = 0, ) -> None: self._key: str = WeakCallback.object_key(obj) self._obj_module: str = getattr(obj, "__module__", None) or "" self._obj_qualname: str = getattr(obj, "__qualname__", "") self._object_repr: str = WeakCallback.object_repr(obj) self._max_args: int | None = max_args self._alive: bool = True self._on_ref_error: RefErrorChoice = on_ref_error self.priority: int = priority def cb(self, args: tuple[Any, ...] = ()) -> None: """Call the callback with `args`. Args will be spread when calling the func.""" raise NotImplementedError() def dereference(self) -> Callable[..., _R] | None: """Return the original object, or None if dead.""" raise NotImplementedError() def __call__(self, *args: Any, **kwds: Any) -> _R: func = self.dereference() if func is None: raise ReferenceError("callback is dead") if self._max_args is not None: args = args[: self._max_args] return func(*args, **kwds) def __eq__(self, other: object) -> bool: # sourcery skip: swap-if-expression if isinstance(other, WeakCallback): return self._key == other._key return NotImplemented def _try_ref( self, obj: _T, finalize: Callable[[WeakCallback], Any] | None = None, ) -> Callable[[], _T | None]: _cb = None if finalize is None else _kill_and_finalize(self, finalize) try: return weakref.ref(obj, _cb) except TypeError: if self._on_ref_error == "raise": raise if self._on_ref_error == "warn": safe_repr = object.__repr__(obj) warn( f"failed to create weakref for {safe_repr}, returning strong ref", stacklevel=2, ) def _strong_ref() -> _T: return obj return _strong_ref def slot_repr(self) -> str: return f"{self._obj_module}.{self._obj_qualname}" @staticmethod def object_key(obj: Any) -> str: """Return a unique key for an object. This includes information about the object's type, module, and id. It has considerations for bound methods (which would otherwise have a different id for each instance). """ if hasattr(obj, "__self__"): # bound method ... don't take the id of the bound method itself. 
obj_id = id(obj.__self__) owner_cls = type(obj.__self__) type_name = getattr(owner_cls, "__name__", None) or "" module = getattr(owner_cls, "__module__", None) or "" method_name = getattr(obj, "__name__", None) or "" obj_name = f"{type_name}.{method_name}" else: obj_id = id(obj) module = getattr(obj, "__module__", None) or "" obj_name = getattr(obj, "__name__", None) or "" return f"{module}:{obj_name}@{hex(obj_id)}" @staticmethod def object_repr(obj: Any) -> str: """Return a human-readable repr for obj.""" module = getattr(obj, "__module__", "") if hasattr(obj, "__self__"): # bound method ... don't take the id of the bound method itself. owner_cls = type(obj.__self__) module = getattr(owner_cls, "__module__", None) or "" method_name = getattr(obj, "__name__", None) or "" if module == "builtins": return method_name type_qname = getattr(owner_cls, "__qualname__", "") return f"{module}.{type_qname}.{method_name}" elif getattr(obj, "__qualname__", ""): return f"{module}.{obj.__qualname__}" elif getattr(type(obj), "__qualname__", ""): return f"{module}.{type(obj).__qualname__}" # this line was hit in py3.7, but not afterwards. # retained as a fallback, but not covered by tests. return repr(obj) # pragma: no cover def __repr__(self) -> str: return f"<{self.__class__.__name__} on {self._object_repr}>" # pragma: no cover def _kill_and_finalize( wcb: WeakCallback, finalize: Callable[[WeakCallback], Any] ) -> Callable[[weakref.ReferenceType], None]: def _cb(_: weakref.ReferenceType) -> None: if wcb._alive: wcb._alive = False finalize(wcb) return _cb @mypyc_attr(serializable=True) class StrongFunction(WeakCallback): """Wrapper around a strong function reference.""" _f: Callable def __init__( self, obj: Callable, max_args: int | None = None, args: tuple[Any, ...] = (), kwargs: dict[str, Any] | None = None, on_ref_error: RefErrorChoice = "warn", priority: int = 0, ) -> None: super().__init__(obj, max_args, on_ref_error, priority) self._f = obj self._args = args self._kwargs = kwargs or {} if args: self._object_repr = f"{self._object_repr}{(*args,)!r}".replace(")", " ...)") def cb(self, args: tuple[Any, ...] = ()) -> None: if self._max_args is not None: args = args[: self._max_args] self._f(*self._args, *args, **self._kwargs) def dereference(self) -> Callable: if self._args or self._kwargs: return partial(self._f, *self._args, **self._kwargs) return self._f def __getstate__(self) -> dict[str, Any]: atr = ("_key", "_max_args", "_alive", "_on_ref_error", "_f", "_args", "_kwargs") return {k: getattr(self, k) for k in atr} def __setstate__(self, state: dict) -> None: for k, v in state.items(): setattr(self, k, v) class WeakFunction(WeakCallback): """Wrapper around a weak function reference.""" def __init__( self, obj: Callable, max_args: int | None = None, args: tuple[Any, ...] = (), kwargs: dict[str, Any] | None = None, finalize: Callable | None = None, on_ref_error: RefErrorChoice = "warn", priority: int = 0, ) -> None: super().__init__(obj, max_args, on_ref_error, priority) self._f = self._try_ref(obj, finalize) self._args = args self._kwargs = kwargs or {} if args: self._object_repr = f"{self._object_repr}{(*args,)!r}".replace(")", " ...)") def cb(self, args: tuple[Any, ...] 
= ()) -> None: f = self._f() if f is None: raise ReferenceError("weakly-referenced object no longer exists") if self._max_args is not None: args = args[: self._max_args] f(*self._args, *args, **self._kwargs) def dereference(self) -> Callable | None: f = self._f() if f is None: return None if self._args or self._kwargs: return partial(f, *self._args, **self._kwargs) return f class WeakMethod(WeakCallback): """Wrapper around a method bound to a weakly-referenced object. Bound methods have a `__self__` attribute that holds a strong reference to the object they are bound to and a `__func__` attribute that holds a reference to the function that implements the method (on the class level) When `cb` is called here, it dereferences the two, and calls: `obj.__func__(obj.__self__, *args, **kwargs)` """ def __init__( self, obj: MethodType, max_args: int | None = None, args: tuple[Any, ...] = (), kwargs: dict[str, Any] | None = None, finalize: Callable | None = None, on_ref_error: RefErrorChoice = "warn", priority: int = 0, ) -> None: super().__init__(obj, max_args, on_ref_error, priority) self._obj_ref = self._try_ref(obj.__self__, finalize) self._func_ref = self._try_ref(obj.__func__, finalize) self._args = args self._kwargs = kwargs or {} if args: self._object_repr = f"{self._object_repr}{(*args,)!r}".replace(")", " ...)") def slot_repr(self) -> str: obj = self._obj_ref() func_name = getattr(self._func_ref(), "__name__", "") return f"{self._obj_module}.{obj.__class__.__qualname__}.{func_name}" def cb(self, args: tuple[Any, ...] = ()) -> None: obj = self._obj_ref() func = self._func_ref() if obj is None or func is None: raise ReferenceError("weakly-referenced object no longer exists") if self._max_args is not None: args = args[: self._max_args] func(obj, *self._args, *args, **self._kwargs) def dereference(self) -> MethodType | partial | None: obj = self._obj_ref() func = self._func_ref() if obj is None or func is None: return None method = cast("MethodType", func.__get__(obj)) if self._args or self._kwargs: return partial(method, *self._args, **self._kwargs) return method class WeakBuiltin(WeakCallback): """Wrapper around a c-based method on a weakly-referenced object. Builtin/extension methods do have a `__self__` attribute (the object to which they are bound), but don't have a __func__ attribute, so we need to store the name of the method and look it up on the object when the callback is called. When `cb` is called here, it dereferences the object, and calls: `getattr(obj.__self__, obj.__name__)(*args, **kwargs)` """ def __init__( self, obj: MethodWrapperType | BuiltinMethodType, max_args: int | None = None, args: tuple[Any, ...] = (), finalize: Callable | None = None, on_ref_error: RefErrorChoice = "warn", priority: int = 0, ) -> None: super().__init__(obj, max_args, on_ref_error, priority) self._obj_ref = self._try_ref(obj.__self__, finalize) self._func_name = obj.__name__ self._args = args if args: self._object_repr = f"{self._object_repr}{(*args,)!r}".replace(")", " ...)") def slot_repr(self) -> str: obj = self._obj_ref() return f"{obj.__class__.__qualname__}.{self._func_name}" def cb(self, args: tuple[Any, ...] 
= ()) -> None: func = getattr(self._obj_ref(), self._func_name, None) if func is None: raise ReferenceError("weakly-referenced object no longer exists") if self._max_args is None: func(*self._args, *args) else: func(*self._args, *args[: self._max_args]) def dereference(self) -> MethodWrapperType | BuiltinMethodType | None: return getattr(self._obj_ref(), self._func_name, None) class WeakSetattr(WeakCallback): """Caller to set an attribute on a weakly-referenced object.""" def __init__( self, obj: object, attr: str, max_args: int | None = None, finalize: Callable | None = None, on_ref_error: RefErrorChoice = "warn", priority: int = 0, ) -> None: super().__init__(obj, max_args, on_ref_error, priority) self._key += f".__setattr__({attr!r})" self._obj_ref = self._try_ref(obj, finalize) self._attr = attr self._object_repr += f".__setattr__({attr!r}, ...)" def slot_repr(self) -> str: obj = self._obj_ref() return f"setattr({obj.__class__.__qualname__}, {self._attr!r}, ...)" def cb(self, args: tuple[Any, ...] = ()) -> None: obj = self._obj_ref() if obj is None: raise ReferenceError("weakly-referenced object no longer exists") if self._max_args is not None: args = args[: self._max_args] setattr(obj, self._attr, args[0] if len(args) == 1 else args) def dereference(self) -> partial | None: obj = self._obj_ref() return None if obj is None else partial(setattr, obj, self._attr) class SupportsSetitem(Protocol): def __setitem__(self, key: Any, value: Any) -> None: ... class WeakSetitem(WeakCallback): """Caller to call __setitem__ on a weakly-referenced object.""" def __init__( self, obj: SupportsSetitem, key: Any, max_args: int | None = None, finalize: Callable | None = None, on_ref_error: RefErrorChoice = "warn", priority: int = 0, ) -> None: super().__init__(obj, max_args, on_ref_error, priority) self._key += f".__setitem__({key!r})" self._obj_ref = self._try_ref(obj, finalize) self._itemkey = key self._object_repr += f".__setitem__({key!r}, ...)" def slot_repr(self) -> str: obj = self._obj_ref() return f"{obj.__class__.__qualname__}.__setitem__({self._itemkey!r}, ...)" def cb(self, args: tuple[Any, ...] = ()) -> None: obj = self._obj_ref() if obj is None: raise ReferenceError("weakly-referenced object no longer exists") if self._max_args is not None: args = args[: self._max_args] obj[self._itemkey] = args[0] if len(args) == 1 else args def dereference(self) -> partial | None: obj = self._obj_ref() return None if obj is None else partial(obj.__setitem__, self._itemkey) # --------------------------- Coroutines --------------------------- class WeakCoroutineFunction(WeakFunction): def cb(self, args: tuple[Any, ...] = ()) -> None: if self._f() is None: raise ReferenceError("weakly-referenced object no longer exists") if self._max_args is not None: args = args[: self._max_args] cast("_AsyncBackend", get_async_backend()).put((self, args)) class StrongCoroutineFunction(StrongFunction): """Wrapper around a strong coroutine function reference.""" def cb(self, args: tuple[Any, ...] = ()) -> None: if self._max_args is not None: args = args[: self._max_args] cast("_AsyncBackend", get_async_backend()).put((self, args)) class WeakCoroutineMethod(WeakMethod): def cb(self, args: tuple[Any, ...] 
= ()) -> None: if self._obj_ref() is None or self._func_ref() is None: raise ReferenceError("weakly-referenced object no longer exists") if self._max_args is not None: args = args[: self._max_args] cast("_AsyncBackend", get_async_backend()).put((self, args)) psygnal-0.15.0/src/psygnal/py.typed0000644000000000000000000000000015073705675014150 0ustar00psygnal-0.15.0/src/psygnal/qt.py0000644000000000000000000000556515073705675013474 0ustar00"""Module that provides Qt-specific functionality for psygnal. This module provides convenience functions for starting and stopping a QTimer that will monitor "queued" signals and invoke their callbacks. This is useful when psygnal is used in a Qt application, and you'd like to emit signals from a thread but have their callbacks invoked in the main thread. """ from __future__ import annotations from threading import Thread, current_thread from ._queue import emit_queued try: from qtpy.QtCore import Qt, QTimer except (ImportError, RuntimeError): # pragma: no cover raise ImportError( "The psygnal.qt module requires qtpy and some Qt backend to be installed" ) from None _TIMERS: dict[Thread, QTimer] = {} def start_emitting_from_queue( msec: int = 0, timer_type: Qt.TimerType = Qt.TimerType.PreciseTimer, thread: Thread | None = None, ) -> None: """Start a QTimer that will monitor the global emission queue. If a QTimer is already running in the current thread, then this function will update the interval and timer type of that QTimer. (It is safe to call this function multiple times in the same thread.) When callbacks are connected to signals with `connect(type='queued')`, then they are not invoked immediately, but rather added to a global queue. This function starts a QTimer that will periodically check the queue and invoke any callbacks that are waiting to be invoked (in whatever thread this QTimer is running in). Parameters ---------- msec : int, optional The interval (in milliseconds) at which the QTimer will check the global emission queue. By default, the QTimer will check the queue as often as possible (i.e. 0 milliseconds). timer_type : Qt.TimerType, optional The type of timer to use. By default, Qt.PreciseTimer is used, which is the most accurate timer available on the system. thread : Thread, optional The thread in which to start the QTimer. By default, the QTimer will be started in the thread from which this function is called. """ _thread = current_thread() if thread is None else thread if _thread not in _TIMERS: _TIMERS[_thread] = QTimer() _TIMERS[_thread].timeout.connect(emit_queued) _TIMERS[_thread].setTimerType(timer_type) if _TIMERS[_thread].isActive(): _TIMERS[_thread].setInterval(msec) else: _TIMERS[_thread].start(msec) def stop_emitting_from_queue(thread: Thread | None = None) -> None: """Stop the QTimer that monitors the global emission queue. thread : Thread, optional The thread in which to stop the QTimer. By default, will stop any QTimers in the thread from which this function is called. 
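# A sketch of the queued-connection workflow this module supports: slots
# connected with `thread="main"` are queued instead of being invoked directly,
# and the QTimer started by `start_emitting_from_queue()` drains that queue in
# the thread that started it.  The worker/app details are illustrative and
# assume a Qt binding (e.g. PyQt6 or PySide6) is installed.
from threading import Thread

from qtpy.QtWidgets import QApplication

from psygnal import Signal
from psygnal.qt import start_emitting_from_queue, stop_emitting_from_queue


class Worker:
    progress = Signal(int)


app = QApplication([])
worker = Worker()

# deliver this callback in the main thread, even when emit() happens elsewhere
worker.progress.connect(lambda v: print("progress:", v), thread="main")
start_emitting_from_queue()  # poll the global queue from this (main) thread

t = Thread(target=lambda: worker.progress.emit(42))
t.start()
t.join()

app.processEvents()          # in a real app the Qt event loop would be running
stop_emitting_from_queue()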
""" _thread = current_thread() if thread is None else thread if (timer := _TIMERS.get(_thread)) is not None: timer.stop() psygnal-0.15.0/src/psygnal/testing.py0000644000000000000000000003041415073705675014514 0ustar00"""Utilities for testing psygnal Signals.""" from __future__ import annotations from contextlib import contextmanager from typing import TYPE_CHECKING from unittest.mock import Mock from unittest.util import safe_repr from psygnal import SignalGroup, SignalInstance if TYPE_CHECKING: from collections.abc import Iterator from threading import Thread from typing import Any, Literal from typing_extensions import Self, TypedDict class ConnectKwargs(TypedDict, total=False): """Kwargs for SignalInstance.connect.""" thread: Thread | Literal["main", "current"] | None check_nargs: bool | None check_types: bool | None unique: bool | Literal["raise"] max_args: int | None on_ref_error: Literal["raise", "warn", "ignore"] priority: int __all__ = [ "SignalTester", "assert_emitted", "assert_emitted_once", "assert_emitted_once_with", "assert_emitted_with", "assert_ever_emitted_with", "assert_not_emitted", ] class SignalTester: """A tester object that listens to a signal and records its emissions. This class wraps a [`SignalInstance`][psygnal.SignalInstance] and a [`unittest.mock.Mock`][] object. It provides methods to connect and disconnect the mock from the signal, and to assert that the signal was emitted with the expected arguments. It also behaves as a **context manager**, so you can monitor emissions of a signal within a specific context. !!! important The signal is *not* automatically connected to the mock when the SignalTester is created. You must call [`connect()`][psygnal.testing.SignalTester.connect] or use the context manager to connect the mock to the signal. Parameters ---------- signal : SignalInstance | SignalGroup The signal instance or group to test. connect_kwargs : ConnectKwargs Keyword arguments to pass to the [`SignalInstance.connect()`][psygnal.SignalInstance.connect] method. Attributes ---------- signal_instance : SignalInstance The signal instance being tested. If a `SignalGroup` is passed, it uses the `_psygnal_relay` attribute to get the underlying `SignalInstance`. mock : unittest.mock.Mock The mock object that will be connected to the signal. 
Examples -------- ```python from psygnal import Signal from psygnal.testing import SignalTester class MyObject: value_changed = Signal(int) obj = MyObject() tester = SignalTester(obj.value_changed) tester.assert_not_emitted() with tester: obj.value_changed.emit(1) tester.assert_emitted() tester.assert_emitted_once() tester.assert_emitted_once_with(1) assert tester.emit_count == 1 tester.reset() assert tester.emit_count == 0 ``` """ def __init__( self, signal: SignalInstance | SignalGroup, connect_kwargs: ConnectKwargs | None = None, ) -> None: super().__init__() self.mock = Mock() if isinstance(signal, SignalGroup): signal_instance: SignalInstance = signal._psygnal_relay else: signal_instance = signal self.signal_instance: SignalInstance = signal_instance self.connect_kwargs = connect_kwargs or {} def reset(self) -> None: """Reset the underlying mock object.""" self.mock.reset_mock() @property def emit_count(self) -> int: """Return the number of times the signal was emitted.""" return self.mock.call_count @property def emit_args(self) -> tuple[Any, ...]: """Return the arguments of the last emission of the signal.""" if (call_args := self.mock.call_args) is None: return () return call_args[0] # type: ignore[no-any-return] @property def emit_args_list(self) -> list[tuple[Any, ...]]: """Return the arguments of all emissions of the signal.""" return [call[0] for call in self.mock.call_args_list] def connect(self) -> None: """Connect the mock to the signal.""" self.signal_instance.connect(self.mock, **self.connect_kwargs) def disconnect(self) -> None: """Disconnect the mock from the signal.""" self.signal_instance.disconnect(self.mock) def __enter__(self) -> Self: """Connect the mock to the signal.""" self.connect() return self def __exit__(self, *args: Any) -> None: """Disconnect the mock from the signal.""" self.disconnect() @property def signal_name(self) -> str: """Return the name of the signal.""" return self.signal_instance.name or "signal" def assert_not_emitted(self) -> None: """Assert that the signal was never emitted.""" if self.mock.call_count != 0: if self.mock.call_count == 1: n = "once" else: n = f"{self.mock.call_count} times" raise AssertionError( f"Expected {self.signal_name!r} to not have been emitted. Emitted {n}." ) def assert_emitted(self) -> None: """Assert that the signal was emitted at least once.""" if self.mock.call_count == 0: raise AssertionError(f"Expected {self.signal_name!r} to have been emitted.") def assert_emitted_once(self) -> None: """Assert that the signal was emitted exactly once.""" if not self.mock.call_count == 1: raise AssertionError( f"Expected {self.signal_name!r} to have been emitted once. " f"Emitted {self.mock.call_count} times." ) def assert_emitted_with(self, /, *args: Any) -> None: """Assert that the *last* emission of the signal had the given arguments.""" if self.mock.call_args is None: raise AssertionError( f"Expected {self.signal_name!r} to have been emitted with arguments " f"{args!r}.\nActual: not emitted" ) actual = self.mock.call_args[0] if actual != args: raise AssertionError( f"Expected {self.signal_name!r} to have been emitted with arguments " f"{args!r}.\nActual: {actual}" ) def assert_emitted_once_with(self, /, *args: Any) -> None: """Assert that the signal was emitted exactly once with the given arguments.""" if not self.mock.call_count == 1: raise AssertionError( f"Expected {self.signal_name!r} to have been emitted exactly once. " f"Emitted {self.mock.call_count} times." 
) actual = self.mock.call_args[0] if actual != args: raise AssertionError( f"Expected {self.signal_name!r} to have been emitted once with " f"arguments {args!r}.\nActual: {safe_repr(actual)}" ) def assert_ever_emitted_with(self, /, *args: Any) -> None: """Assert that the signal was emitted *ever* with the given arguments.""" if self.mock.call_args is None: raise AssertionError( f"Expected {self.signal_name!r} to have been emitted at least once " f"with arguments {args!r}.\nActual: not emitted" ) actual = [call[0] for call in self.mock.call_args_list] if not any(call == args for call in actual): _actual: tuple | list = actual[0] if len(actual) == 1 else actual raise AssertionError( f"Expected {self.signal_name!r} to have been emitted at least once " f"with arguments {args!r}.\nActual: {safe_repr(_actual)}" ) @contextmanager def assert_emitted( signal: SignalInstance | SignalGroup, connect_kwargs: ConnectKwargs | None = None ) -> Iterator[SignalTester]: """Assert that a signal was emitted at least once. Parameters ---------- signal : SignalInstance | SignalGroup The signal instance or group to test. connect_kwargs : ConnectKwargs Keyword arguments to pass to the [`SignalInstance.connect()`][psygnal.SignalInstance.connect] method. Raises ------ AssertionError If the signal was never emitted. """ with SignalTester(signal, connect_kwargs) as mock: yield mock mock.assert_emitted() @contextmanager def assert_emitted_once( signal: SignalInstance | SignalGroup, connect_kwargs: ConnectKwargs | None = None ) -> Iterator[SignalTester]: """Assert that a signal was emitted exactly once. Parameters ---------- signal : SignalInstance | SignalGroup The signal instance or group to test. connect_kwargs : ConnectKwargs Keyword arguments to pass to the [`SignalInstance.connect()`][psygnal.SignalInstance.connect] method. Raises ------ AssertionError If the signal was emitted more than once. """ with SignalTester(signal, connect_kwargs) as mock: yield mock mock.assert_emitted_once() @contextmanager def assert_not_emitted( signal: SignalInstance | SignalGroup, connect_kwargs: ConnectKwargs | None = None ) -> Iterator[SignalTester]: """Assert that a signal was never emitted. Parameters ---------- signal : SignalInstance | SignalGroup The signal instance or group to test. connect_kwargs : ConnectKwargs Keyword arguments to pass to the [`SignalInstance.connect()`][psygnal.SignalInstance.connect] method. Raises ------ AssertionError If the signal was emitted at least once. """ with SignalTester(signal, connect_kwargs) as mock: yield mock mock.assert_not_emitted() @contextmanager def assert_emitted_with( signal: SignalInstance | SignalGroup, *args: Any, connect_kwargs: ConnectKwargs | None = None, ) -> Iterator[SignalTester]: """Assert that the *last* emission of the signal had the given arguments. Parameters ---------- signal : SignalInstance | SignalGroup The signal instance or group to test. *args : Any The arguments to check for in the last emission of the signal. connect_kwargs : ConnectKwargs Keyword arguments to pass to the [`SignalInstance.connect()`][psygnal.SignalInstance.connect] method. Raises ------ AssertionError If the signal was never emitted or if the last emission did not have the expected arguments. 
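    Examples
    --------
    A short sketch of the context-manager form (`MyObject` is a hypothetical
    class with a `value_changed = Signal(int)` attribute, as in the
    `SignalTester` example above):

    ```python
    from psygnal.testing import assert_emitted_with

    obj = MyObject()
    with assert_emitted_with(obj.value_changed, 42):
        obj.value_changed.emit(42)  # ok: the last emission matches (42,)
    ```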
""" with assert_emitted(signal, connect_kwargs) as mock: yield mock mock.assert_emitted_with(*args) @contextmanager def assert_emitted_once_with( signal: SignalInstance | SignalGroup, *args: Any, connect_kwargs: ConnectKwargs | None = None, ) -> Iterator[SignalTester]: """Assert that the signal was emitted exactly once with the given arguments. Parameters ---------- signal : SignalInstance | SignalGroup The signal instance or group to test. *args : Any The arguments to check for in the last emission of the signal. connect_kwargs : ConnectKwargs Keyword arguments to pass to the [`SignalInstance.connect()`][psygnal.SignalInstance.connect] method. Raises ------ AssertionError If the signal was not emitted or was emitted more than once or if the last emission did not have the expected arguments. """ with assert_emitted_once(signal, connect_kwargs) as mock: yield mock mock.assert_emitted_once_with(*args) @contextmanager def assert_ever_emitted_with( signal: SignalInstance | SignalGroup, *args: Any, connect_kwargs: ConnectKwargs | None = None, ) -> Iterator[SignalTester]: """Assert that the signal was emitted *ever* with the given arguments. Parameters ---------- signal : SignalInstance | SignalGroup The signal instance or group to test. *args : Any The arguments to check for in any emission of the signal. connect_kwargs : ConnectKwargs Keyword arguments to pass to the [`SignalInstance.connect()`][psygnal.SignalInstance.connect] method. Raises ------ AssertionError If the signal was never emitted or if it was emitted but not with the expected arguments. """ with assert_emitted(signal, connect_kwargs) as mock: yield mock mock.assert_ever_emitted_with(*args) psygnal-0.15.0/src/psygnal/utils.py0000644000000000000000000001133315073705675014176 0ustar00"""These utilities may help when using signals and evented objects.""" from __future__ import annotations from contextlib import contextmanager, suppress from functools import partial from pathlib import Path from typing import TYPE_CHECKING, Any from warnings import warn from ._group import EmissionInfo, SignalGroup from ._signal import SignalInstance if TYPE_CHECKING: from collections.abc import Callable, Generator, Iterator __all__ = ["iter_signal_instances", "monitor_events"] def _default_event_monitor(info: EmissionInfo) -> None: print(f"{info.signal.name}.emit{info.args!r}") @contextmanager def monitor_events( obj: Any | None = None, logger: Callable[[EmissionInfo], Any] = _default_event_monitor, include_private_attrs: bool = False, ) -> Iterator[None]: """Context manager to print or collect events emitted by SignalInstances on `obj`. Parameters ---------- obj : object, optional Any object that has an attribute that has a SignalInstance (or SignalGroup). If None, all SignalInstances will be monitored. logger : Callable[[EmissionInfo], None], optional A optional function to handle the logging of the event emission. This function must take two positional args: a signal name string, and a tuple that contains the emitted arguments. The default logger simply prints the signal name and emitted args. 
include_private_attrs : bool Whether private signals (starting with an underscore) should also be logged, by default False """ code = getattr(logger, "__code__", None) _old_api = bool(code and code.co_argcount > 1) if obj is None: # install the hook globally if _old_api: raise ValueError( "logger function must take a single argument (an EmissionInfo instance)" ) before, SignalInstance._debug_hook = SignalInstance._debug_hook, logger else: if _old_api: warn( "logger functions must now take a single argument (an instance of " "psygnal.EmissionInfo). Please update your logger function.", stacklevel=2, ) disconnectors = set() for siginst in iter_signal_instances(obj, include_private_attrs): if _old_api: def _report(*args: Any, signal: SignalInstance = siginst) -> None: logger(signal.name, args) # type: ignore else: def _report(*args: Any, signal: SignalInstance = siginst) -> None: logger(EmissionInfo(signal, args)) disconnectors.add(partial(siginst.disconnect, siginst.connect(_report))) try: yield finally: if obj is None: SignalInstance._debug_hook = before else: for disconnector in disconnectors: disconnector() def iter_signal_instances( obj: Any, include_private_attrs: bool = False ) -> Generator[SignalInstance, None, None]: """Yield all `SignalInstance` attributes found on `obj`. Parameters ---------- obj : object Any object that has an attribute that has a SignalInstance (or SignalGroup). include_private_attrs : bool Whether private signals (starting with an underscore) should also be logged, by default False Yields ------ SignalInstance SignalInstances (and SignalGroups) found as attributes on `obj`. """ # SignalGroup if isinstance(obj, SignalGroup): for sig in obj: yield obj[sig] return # Signal attached to Class for n in dir(obj): if not include_private_attrs and n.startswith("_"): continue with suppress(Exception): # if we can't access the attribute, skip it attr = getattr(obj, n) if isinstance(attr, SignalInstance): yield attr if isinstance(attr, SignalGroup): yield attr._psygnal_relay _COMPILED_EXTS = (".so", ".pyd") _BAK = "_BAK" def decompile() -> None: """Mangle names of mypyc-compiled files so that they aren't used. This function requires write permissions to the psygnal source directory. """ for suffix in _COMPILED_EXTS: # pragma: no cover for path in Path(__file__).parent.rglob(f"**/*{suffix}"): path.rename(path.with_suffix(f"{suffix}{_BAK}")) def recompile() -> None: """Fix all name-mangled mypyc-compiled files so that they ARE used. This function requires write permissions to the psygnal source directory. 
""" for suffix in _COMPILED_EXTS: # pragma: no cover for path in Path(__file__).parent.rglob(f"**/*{suffix}{_BAK}"): path.rename(path.with_suffix(suffix)) psygnal-0.15.0/src/psygnal/_pyinstaller_util/__init__.py0000644000000000000000000000000015073705675020324 0ustar00psygnal-0.15.0/src/psygnal/_pyinstaller_util/_pyinstaller_hook.py0000644000000000000000000000020015073705675022314 0ustar00from pathlib import Path CURRENT_DIR = Path(__file__).parent def get_hook_dirs() -> list[str]: return [str(CURRENT_DIR)] psygnal-0.15.0/src/psygnal/_pyinstaller_util/hook-psygnal.py0000644000000000000000000000234415073705675021215 0ustar00from collections.abc import Iterable from importlib.metadata import PackageNotFoundError, PackagePath from importlib.metadata import files as package_files from pathlib import Path try: import psygnal PSYGNAL_DIR = Path(psygnal.__file__).parent except ImportError: PSYGNAL_DIR = Path(__file__).parent.parent def binary_files(file_list: Iterable[PackagePath | Path]) -> list[Path]: return [Path(file) for file in file_list if file.suffix in {".so", ".pyd"}] def create_hiddenimports() -> list[str]: res = ["queue", "mypy_extensions", "__future__"] try: files_list = package_files("psygnal") except PackageNotFoundError: return res if files_list is None: return res modules = binary_files(files_list) if len(modules) < 2: # This is a workaround for a bug in importlib.metadata in editable mode src_path = PSYGNAL_DIR.parent modules = [ x.relative_to(src_path) for x in binary_files(PSYGNAL_DIR.iterdir()) + binary_files(src_path.iterdir()) ] for module in modules: res.append(str(module).split(".")[0].replace("/", ".").replace("\\", ".")) return res hiddenimports = create_hiddenimports() psygnal-0.15.0/src/psygnal/containers/__init__.py0000644000000000000000000000351215073705675016742 0ustar00"""Containers backed by psygnal events. These classes provide "evented" versions of mutable python containers. They each have an `events` attribute (`SignalGroup`) that has a variety of signals that will emit whenever the container is mutated. See [Container SignalGroups](#container-signalgroups) for the corresponding container type for details on the available signals. 
""" from typing import TYPE_CHECKING, Any from ._evented_dict import DictEvents, EventedDict from ._evented_list import EventedList, ListEvents from ._evented_set import EventedOrderedSet, EventedSet, OrderedSet, SetEvents from ._selectable_evented_list import SelectableEventedList from ._selection import Selection if TYPE_CHECKING: from ._evented_proxy import ( CallableProxyEvents, EventedCallableObjectProxy, EventedObjectProxy, ProxyEvents, ) __all__ = [ "CallableProxyEvents", "DictEvents", "EventedCallableObjectProxy", "EventedDict", "EventedList", "EventedObjectProxy", "EventedOrderedSet", "EventedSet", "ListEvents", "OrderedSet", "ProxyEvents", "SelectableEventedList", "Selection", "SetEvents", ] def __getattr__(name: str) -> Any: # pragma: no cover if name == "EventedObjectProxy": from ._evented_proxy import EventedObjectProxy return EventedObjectProxy if name == "EventedCallableObjectProxy": from ._evented_proxy import EventedCallableObjectProxy return EventedCallableObjectProxy if name == "CallableProxyEvents": from ._evented_proxy import CallableProxyEvents return CallableProxyEvents if name == "ProxyEvents": from ._evented_proxy import ProxyEvents return ProxyEvents raise AttributeError( # pragma: no cover f"module {__name__!r} has no attribute {name!r}" ) psygnal-0.15.0/src/psygnal/containers/_evented_dict.py0000644000000000000000000001670315073705675020005 0ustar00"""Dict that emits events when altered.""" from __future__ import annotations from collections.abc import Iterable, Iterator, Mapping, MutableMapping, Sequence from functools import partial from typing import TYPE_CHECKING, Any, ClassVar, TypeAlias, TypeVar, get_args if TYPE_CHECKING: from pydantic import GetCoreSchemaHandler, SerializationInfo from typing_extensions import Self from psygnal._group import EmissionInfo, PathStep, SignalGroup from psygnal._signal import Signal, SignalInstance _K = TypeVar("_K") _V = TypeVar("_V") TypeOrSequenceOfTypes: TypeAlias = type[_V] | Sequence[type[_V]] DictArg: TypeAlias = Mapping[_K, _V] | Iterable[tuple[_K, _V]] class TypedMutableMapping(MutableMapping[_K, _V]): """Dictionary that enforces value type. Parameters ---------- data : Union[Mapping[_K, _V], Iterable[Tuple[_K, _V]], None], optional Data suitable of passing to dict(). Mapping of {key: value} pairs, or Iterable of two-tuples [(key, value), ...], or None to create an basetype : TypeOrSequenceOfTypes, optional Type or Sequence of Type objects. If provided, values entered into this Mapping must be an instance of one of the provided types. by default () """ def __init__( self, data: DictArg | None = None, *, basetype: TypeOrSequenceOfTypes = (), **kwargs: _V, ): self._dict: dict[_K, _V] = {} self._basetypes: tuple[type[_V], ...] 
= ( tuple(basetype) if isinstance(basetype, Sequence) else (basetype,) ) self.update({} if data is None else data, **kwargs) def __setitem__(self, key: _K, value: _V) -> None: self._dict[key] = self._type_check(value) def __delitem__(self, key: _K) -> None: del self._dict[key] def __getitem__(self, key: _K) -> _V: return self._dict[key] def __len__(self) -> int: return len(self._dict) def __iter__(self) -> Iterator[_K]: return iter(self._dict) def __repr__(self) -> str: return repr(self._dict) def _type_check(self, value: _V) -> _V: """Check the types of items if basetypes are set for the model.""" if self._basetypes and not any(isinstance(value, t) for t in self._basetypes): raise TypeError( f"Cannot add object with type {type(value)} to TypedDict expecting" f"type {self._basetypes}" ) return value def __newlike__(self, mapping: MutableMapping[_K, _V]) -> Self: new = self.__class__() # separating this allows subclasses to omit these from their `__init__` new._basetypes = self._basetypes new.update(mapping) return new def copy(self) -> Self: """Return a shallow copy of the dictionary.""" return self.__newlike__(self) def __copy__(self) -> Self: return self.copy() # PYDANTIC SUPPORT @classmethod def __get_pydantic_core_schema__( cls, source_type: Any, handler: GetCoreSchemaHandler ) -> Mapping[str, Any]: """Return the Pydantic core schema for this object.""" from pydantic_core import core_schema def _serialize(obj: EventedDict[_K, _V], info: SerializationInfo, /) -> Any: if info.mode_is_json(): return obj._dict return cls(obj._dict) # get key/value types key_type = val_type = Any if args := get_args(source_type): key_type = args[0] if len(args) > 1: val_type = args[1] # get key/value schemas and validators keys_schema = handler.generate_schema(key_type) values_schema = handler.generate_schema(val_type) dict_schema = core_schema.dict_schema( keys_schema=keys_schema, values_schema=values_schema, ) return core_schema.no_info_after_validator_function( function=cls, schema=dict_schema, json_schema_input_schema=dict_schema, serialization=core_schema.plain_serializer_function_ser_schema( _serialize, info_arg=True, ), ) class DictSignalInstance(SignalInstance): def _psygnal_relocate_info_(self, emission_info: EmissionInfo) -> EmissionInfo: """Relocate the emission info to the key being modified. (All signals on EventedDict have the key as the first argument.) """ if args := emission_info.args: return emission_info.insert_path(PathStep(key=args[0])) return emission_info DictSignal = partial(Signal, signal_instance_class=DictSignalInstance) class DictEvents(SignalGroup): """Events available on [EventedDict][psygnal.containers.EventedDict].""" adding = DictSignal(object) # (key, ) """`(key,)` emitted before an item is added at `key`""" added = DictSignal(object, object) # (key, value) """`(key, value)` emitted after a `value` is added at `key`""" changing = DictSignal(object) # (key, ) """`(key, old_value, new_value)` emitted before `old_value` is replaced with `new_value` at `key`""" changed = DictSignal(object, object, object) # (key, old_value, value) """`(key, old_value, new_value)` emitted before `old_value` is replaced with `new_value` at `key`""" removing = DictSignal(object) # (key, ) """`(key,)` emitted before an item is removed at `key`""" removed = DictSignal(object, object) # (key, value) """`(key, value)` emitted after `value` is removed at `key`""" class EventedDict(TypedMutableMapping[_K, _V]): """Mutable mapping that emits events when altered. 
This class is designed to behave exactly like the builtin [`dict`][], but will emit events before and after all mutations (addition, removal, and changing). Parameters ---------- data : Union[Mapping[_K, _V], Iterable[Tuple[_K, _V]], None], optional Data suitable of passing to dict(). Mapping of {key: value} pairs, or Iterable of two-tuples [(key, value), ...], or None to create an basetype : TypeOrSequenceOfTypes, optional Type or Sequence of Type objects. If provided, values entered into this Mapping must be an instance of one of the provided types. by default (). Attributes ---------- events: DictEvents The `SignalGroup` object that emits all events available on an `EventedDict`. """ events: DictEvents # pragma: no cover _psygnal_group_: ClassVar[str] = "events" def __init__( self, data: DictArg | None = None, *, basetype: TypeOrSequenceOfTypes = (), **kwargs: _V, ): self.events = DictEvents() super().__init__(data, basetype=basetype, **kwargs) def __setitem__(self, key: _K, value: _V) -> None: if key not in self._dict: self.events.adding.emit(key) super().__setitem__(key, value) self.events.added.emit(key, value) else: old_value = self._dict[key] if value is not old_value: self.events.changing.emit(key) super().__setitem__(key, value) self.events.changed.emit(key, old_value, value) def __delitem__(self, key: _K) -> None: item = self._dict[key] self.events.removing.emit(key) super().__delitem__(key) self.events.removed.emit(key, item) def __repr__(self) -> str: return f"{self.__class__.__name__}({super().__repr__()})" psygnal-0.15.0/src/psygnal/containers/_evented_list.py0000644000000000000000000004112015073705675020024 0ustar00"""MutableSequence that emits events when altered. Note For Developers =================== Be cautious when re-implementing typical list-like methods here (e.g. extend, pop, clear, etc...). By not re-implementing those methods, we force ALL "CRUD" (create, read, update, delete) operations to go through a few key methods defined by the abc.MutableSequence interface, where we can emit the necessary events. Specifically: - `insert` = "create" : add a new item/index to the list - `__getitem__` = "read" : get the value of an existing index - `__setitem__` = "update" : update the value of an existing index - `__delitem__` = "delete" : remove an existing index from the list All of the additional list-like methods are provided by the MutableSequence interface, and call one of those 4 methods. So if you override a method, you MUST make sure that all the appropriate events are emitted. (Tests should cover this in test_evented_list.py) """ from __future__ import annotations from collections.abc import Iterable, Mapping, MutableSequence from functools import partial from typing import ( TYPE_CHECKING, Any, ClassVar, TypeAlias, TypeVar, cast, get_args, overload, ) from psygnal._group import EmissionInfo, PathStep, SignalGroup from psygnal._signal import Signal, SignalInstance from psygnal.utils import iter_signal_instances if TYPE_CHECKING: from pydantic import GetCoreSchemaHandler, SerializationInfo from typing_extensions import Self _T = TypeVar("_T") Index: TypeAlias = int | slice class ListSignalInstance(SignalInstance): def _psygnal_relocate_info_(self, emission_info: EmissionInfo) -> EmissionInfo: """Relocate the emission info to the index being modified. (All signals on EventedList have the index as the first argument.) 
""" if args := emission_info.args: return emission_info.insert_path(PathStep(index=args[0])) return emission_info ListSignal = partial(Signal, signal_instance_class=ListSignalInstance) class ListEvents(SignalGroup): """Events available on [EventedList][psygnal.containers.EventedList].""" inserting = ListSignal(int) """`(index)` emitted before an item is inserted at `index`""" inserted = ListSignal(int, object) """`(index, value)` emitted after `value` is inserted at `index`""" removing = ListSignal(int) """`(index)` emitted before an item is removed at `index`""" removed = ListSignal(int, object) """`(index, value)` emitted after `value` is removed at `index`""" moving = ListSignal(int, int) """`(index, new_index)` emitted before an item is moved from `index` to `new_index`""" moved = ListSignal(int, int, object) """`(index, new_index, value)` emitted after `value` is moved from `index` to `new_index`""" changed = ListSignal(object, object, object) """`(index_or_slice, old_value, value)` emitted when `index` is set from `old_value` to `value`""" reordered = Signal() """Emitted when the list is reordered (eg. moved/reversed).""" child_event = Signal(EmissionInfo) """`(EmissionInfo)` emitted when an object in the list emits an event. Note that the `EventedList` must be created with `child_events=True` in order for this to be emitted. """ class EventedList(MutableSequence[_T]): """Mutable Sequence that emits events when altered. This class is designed to behave exactly like the builtin `list`, but will emit events before and after all mutations (insertion, removal, setting, and moving). Parameters ---------- data : iterable, optional Elements to initialize the list with. hashable : bool Whether the list should be hashable as id(self). By default `True`. child_events: bool Whether to re-emit events from emitted from evented items in the list (i.e. items that have SignalInstances). If `True`, child events can be connected at `EventedList.events.child_event`. By default, `True`. Attributes ---------- events : ListEvents SignalGroup that with events related to list mutation. (see ListEvents) """ events: ListEvents # pragma: no cover _psygnal_group_: ClassVar[str] = "events" def __init__( self, data: Iterable[_T] = (), *, hashable: bool = True, child_events: bool = True, ): super().__init__() self._data: list[_T] = [] self._hashable = hashable self._child_events = child_events self.events = ListEvents(instance=self) self.extend(data) # WAIT!! ... Read the module docstring before reimplement these methods # def append(self, item): ... # def clear(self): ... # def pop(self, index=-1): ... # def extend(self, value: Iterable[_T]): ... # def remove(self, value: Any): ... def insert(self, index: int, value: _T) -> None: """Insert `value` before index.""" _value = self._pre_insert(value) self.events.inserting.emit(index) self._data.insert(index, _value) self.events.inserted.emit(index, value) self._post_insert(value) @overload def __getitem__(self, key: int) -> _T: ... @overload def __getitem__(self, key: slice) -> Self: ... def __getitem__(self, key: Index) -> _T | Self: """Return self[key].""" result = self._data[key] return self.__newlike__(result) if isinstance(result, list) else result @overload def __setitem__(self, key: int, value: _T) -> None: ... @overload def __setitem__(self, key: slice, value: Iterable[_T]) -> None: ... 
def __setitem__(self, key: Index, value: _T | Iterable[_T]) -> None: """Set self[key] to value.""" old = self._data[key] if value is old: return # sourcery skip: hoist-similar-statement-from-if, hoist-statement-from-if if isinstance(key, slice): if not isinstance(value, Iterable): raise TypeError("Can only assign an iterable to slice") value = [self._pre_insert(v) for v in value] # before we mutate the list self._data[key] = value else: value = self._pre_insert(cast("_T", value)) self._data[key] = value self.events.changed.emit(key, old, value) def __delitem__(self, key: Index) -> None: """Delete self[key].""" # delete from the end for parent, index in sorted(self._delitem_indices(key), reverse=True): parent.events.removing.emit(index) parent._pre_remove(index) item = parent._data.pop(index) self.events.removed.emit(index, item) def _delitem_indices(self, key: Index) -> Iterable[tuple[EventedList[_T], int]]: # returning (self, int) allows subclasses to pass nested members if isinstance(key, int): yield (self, key if key >= 0 else key + len(self)) elif isinstance(key, slice): yield from ((self, i) for i in range(*key.indices(len(self)))) else: n = repr(type(key).__name__) raise TypeError(f"EventedList indices must be integers or slices, not {n}") def _pre_insert(self, value: _T) -> _T: """Validate and or modify values prior to inserted.""" return value def _post_insert(self, new_item: _T) -> None: """Modify and or handle values after insertion.""" if self._child_events: self._connect_child_emitters(new_item) def _pre_remove(self, index: int) -> None: """Modify and or handle values before removal.""" if self._child_events: self._disconnect_child_emitters(self[index]) def __newlike__(self, iterable: Iterable[_T]) -> Self: """Return new instance of same class.""" return self.__class__(iterable) def copy(self) -> Self: """Return a shallow copy of the list.""" return self.__newlike__(self) def __copy__(self) -> Self: return self.copy() def __add__(self, other: Iterable[_T]) -> Self: """Add other to self, return new object.""" copy = self.copy() copy.extend(other) return copy def __iadd__(self, other: Iterable[_T]) -> Self: """Add other to self in place (self += other).""" self.extend(other) return self def __radd__(self, other: list) -> list: """Reflected add (other + self). Cast self to list.""" return other + list(self) def __len__(self) -> int: """Return len(self).""" return len(self._data) def __repr__(self) -> str: """Return repr(self).""" return f"{type(self).__name__}({self._data})" def __eq__(self, other: Any) -> bool: """Return self==value.""" return bool(self._data == other) def __hash__(self) -> int: """Return hash(self).""" # it's important to add this to allow this object to be hashable # given that we've also reimplemented __eq__ if self._hashable: return id(self) name = self.__class__.__name__ raise TypeError( f"unhashable type: {name!r}. " f"Create with {name}(..., hashable=True) if you need hashability" ) def reverse(self, *, emit_individual_events: bool = False) -> None: """Reverse list *IN PLACE*.""" if emit_individual_events: super().reverse() else: self._data.reverse() self.events.reordered.emit() def move(self, src_index: int, dest_index: int = 0) -> bool: """Insert object at `src_index` before `dest_index`. Both indices refer to the list prior to any object removal (pre-move space). 
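        Examples
        --------
        A small illustration of the pre-move indexing (list contents are
        arbitrary):

        >>> lst = EventedList([0, 1, 2, 3])
        >>> lst.move(0, 3)  # insert the item at index 0 before index 3 (pre-move)
        True
        >>> lst
        EventedList([1, 2, 0, 3])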
""" if dest_index < 0: dest_index += len(self) + 1 if dest_index in (src_index, src_index + 1): # this is a no-op return False self.events.moving.emit(src_index, dest_index) item = self._data.pop(src_index) if dest_index > src_index: dest_index -= 1 self._data.insert(dest_index, item) self.events.moved.emit(src_index, dest_index, item) self.events.reordered.emit() return True def move_multiple(self, sources: Iterable[Index], dest_index: int = 0) -> int: """Move a batch of `sources` indices, to a single destination. Note, if `dest_index` is higher than any of the `sources`, then the resulting position of the moved objects after the move operation is complete will be lower than `dest_index`. Parameters ---------- sources : Iterable[Union[int, slice]] A sequence of indices dest_index : int, optional The destination index. All sources will be inserted before this index (in pre-move space), by default 0... which has the effect of "bringing to front" everything in `sources`, or acting as a "reorder" method if `sources` contains all indices. Returns ------- int The number of successful move operations completed. Raises ------ TypeError If the destination index is a slice, or any of the source indices are not `int` or `slice`. """ # calling list here makes sure that there are no index errors up front move_plan = list(self._move_plan(sources, dest_index)) # don't assume index adjacency ... so move objects one at a time # this *could* be simplified with an intermediate list ... but this way # allows any views (such as QtViews) to update themselves more easily. # If this needs to be changed in the future for performance reasons, # then the associated QtListView will need to changed from using # `beginMoveRows` & `endMoveRows` to using `layoutAboutToBeChanged` & # `layoutChanged` while *manually* updating model indices with # `changePersistentIndexList`. That becomes much harder to do with # nested tree-like models. with self.events.reordered.blocked(): for src, dest in move_plan: self.move(src, dest) self.events.reordered.emit() return len(move_plan) def _move_plan( self, sources: Iterable[Index], dest_index: int ) -> Iterable[tuple[int, int]]: """Yield prepared indices for a multi-move. Given a set of `sources` from anywhere in the list, and a single `dest_index`, this function computes and yields `(from_index, to_index)` tuples that can be used sequentially in single move operations. It keeps track of what has moved where and updates the source and destination indices to reflect the model at each point in the process. This is useful for a drag-drop operation with a QtModel/View. Parameters ---------- sources : Iterable[tuple[int, ...]] An iterable of tuple[int] that should be moved to `dest_index`. dest_index : Tuple[int] The destination for sources. 
""" if isinstance(dest_index, slice): raise TypeError("Destination index may not be a slice") # pragma: no cover to_move: list[int] = [] for idx in sources: if isinstance(idx, slice): to_move.extend(list(range(*idx.indices(len(self))))) elif isinstance(idx, int): to_move.append(idx) else: raise TypeError( "Can only move integer or slice indices" ) # pragma: no cover to_move = list(dict.fromkeys(to_move)) if dest_index < 0: dest_index += len(self) + 1 d_inc = 0 popped: list[int] = [] for i, src in enumerate(to_move): if src != dest_index: # we need to decrement the src_i by 1 for each time we have # previously pulled items out from in front of the src_i src -= sum(x <= src for x in popped) # if source is past the insertion point, increment src for each # previous insertion if src >= dest_index: src += i yield src, dest_index + d_inc popped.append(src) # if the item moved up, increment the destination index if dest_index <= src: d_inc += 1 def _connect_child_emitters(self, child: _T) -> None: """Connect all events from the child to be reemitted.""" for emitter in iter_signal_instances(child): emitter.connect(self._reemit_child_event) def _disconnect_child_emitters(self, child: _T) -> None: """Disconnect all events from the child from the reemitter.""" for emitter in iter_signal_instances(child): emitter.disconnect(self._reemit_child_event) def _reemit_child_event(self, *args: Any) -> None: """Re-emit event from child with index.""" emitter = Signal.current_emitter() if emitter is None: return # pragma: no cover obj = emitter.instance try: idx = self.index(obj) except ValueError: # pragma: no cover return if args and isinstance(args[0], EmissionInfo): child_info = EmissionInfo( signal=args[0].signal, args=args[0].args, path=(PathStep(index=idx), *args[0].path), ) else: child_info = EmissionInfo( signal=emitter, args=args, path=(PathStep(index=idx), PathStep(attr=emitter.name)), ) self.events.child_event.emit(child_info) # PYDANTIC SUPPORT @classmethod def __get_pydantic_core_schema__( cls, source_type: Any, handler: GetCoreSchemaHandler ) -> Mapping[str, Any]: """Return the Pydantic core schema for this object.""" from pydantic_core import core_schema def _serialize(obj: EventedList[_T], info: SerializationInfo, /) -> Any: if info.mode_is_json(): return obj._data return cls(obj._data) item_type = args[0] if (args := get_args(source_type)) else Any items_schema = handler.generate_schema(item_type) list_schema = core_schema.list_schema(items_schema=items_schema) return core_schema.no_info_after_validator_function( function=cls, schema=list_schema, json_schema_input_schema=list_schema, serialization=core_schema.plain_serializer_function_ser_schema( _serialize, info_arg=True, ), ) psygnal-0.15.0/src/psygnal/containers/_evented_proxy.py0000644000000000000000000001753015073705675020242 0ustar00from collections.abc import Callable from functools import partial from typing import Any, ClassVar, Generic, TypeVar from weakref import finalize try: from wrapt import ObjectProxy except ImportError as e: raise type(e)( f"{e}. 
Please `pip install psygnal[proxy]` to use EventedObjectProxies" ) from e from psygnal._group import SignalGroup from psygnal._signal import Signal T = TypeVar("T") _UNSET = object() class ProxyEvents(SignalGroup): """Events emitted by `EventedObjectProxy` and `EventedCallableObjectProxy`.""" attribute_set = Signal(str, object) """Emitted when an attribute is set.""" attribute_deleted = Signal(str) """Emitted when an attribute is deleted.""" item_set = Signal(object, object) """Emitted when an item is set.""" item_deleted = Signal(object) """Emitted when an item is deleted.""" in_place = Signal(str, object) """Emitted when an in-place operation is performed.""" class CallableProxyEvents(ProxyEvents): """Events emitted by `EventedCallableObjectProxy`.""" called = Signal(tuple, dict) """Emitted when the object is called.""" # we're using a cache instead of setting the events object directly on the proxy # because when wrapt is compiled as a C extensions, the ObjectProxy is not allowed # to add any new attributes. _OBJ_CACHE: dict[int, ProxyEvents] = {} class EventedObjectProxy(ObjectProxy, Generic[T]): """Create a proxy of `target` that includes an `events` [psygnal.SignalGroup][]. Provides an "evented" subclasses of [`wrapt.ObjectProxy`](https://wrapt.readthedocs.io/en/latest/wrappers.html#object-proxy) !!! important This class requires `wrapt` to be installed. You can install directly (`pip install wrapt`) or by using the psygnal extra: `pip install psygnal[proxy]` Signals will be emitted whenever an attribute is set or deleted, or (if the object implements `__getitem__`) whenever an item is set or deleted. If the object supports in-place modification (i.e. any of the `__i{}__` magic methods), then an `in_place` event is emitted (with the name of the method) whenever any of them are used. The events available at target.events include: - `attribute_set`: `Signal(str, object)` - `attribute_deleted`: `Signal(str)` - `item_set`: `Signal(object, object)` - `item_deleted`: `Signal(object)` - `in_place`: `Signal(str, object)` !!! warning "Experimental" This object is experimental! They may affect the behavior of the wrapped object in unanticipated ways. Please consult the [wrapt documentation](https://wrapt.readthedocs.io/en/latest/wrappers.html) for details on how the Object Proxy works. 
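    A brief sketch (wrapping a `types.SimpleNamespace` purely for
    illustration; the connected callback is likewise illustrative):

    ```python
    from types import SimpleNamespace
    from psygnal.containers import EventedObjectProxy

    proxy = EventedObjectProxy(SimpleNamespace(x=1))
    proxy.events.attribute_set.connect(lambda name, value: print(name, value))
    proxy.x = 2  # prints "x 2"
    ```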
Parameters ---------- target : Any An object to wrap """ _psygnal_group_: ClassVar[str] = "events" def __init__(self, target: Any): super().__init__(target) @property def events(self) -> ProxyEvents: # pragma: no cover # unclear why """`SignalGroup` containing events for this object proxy.""" obj_id = id(self) if obj_id not in _OBJ_CACHE: _OBJ_CACHE[obj_id] = ProxyEvents() finalize(self, partial(_OBJ_CACHE.pop, obj_id, None)) return _OBJ_CACHE[obj_id] def __setattr__(self, name: str, value: None) -> None: before = getattr(self, name, _UNSET) super().__setattr__(name, value) if before is not (after := getattr(self, name, _UNSET)): self.events.attribute_set(name, after) def __delattr__(self, name: str) -> None: super().__delattr__(name) self.events.attribute_deleted(name) def __setitem__(self, key: Any, value: Any) -> None: before = self[key] super().__setitem__(key, value) if before is not (after := self[key]): self.events.item_set(key, after) def __delitem__(self, key: Any) -> None: super().__delitem__(key) self.events.item_deleted(key) def __repr__(self) -> str: return repr(self.__wrapped__) def __dir__(self) -> list[str]: return [*dir(self.__wrapped__), "events"] def __iadd__(self, other: Any) -> T: self.events.in_place("add", other) return super().__iadd__(other) # type: ignore def __isub__(self, other: Any) -> T: self.events.in_place("sub", other) return super().__isub__(other) # type: ignore def __imul__(self, other: Any) -> T: self.events.in_place("mul", other) return super().__imul__(other) # type: ignore def __imatmul__(self, other: Any) -> T: self.events.in_place("matmul", other) self.__wrapped__ @= other # not in wrapt # type: ignore return self def __itruediv__(self, other: Any) -> T: self.events.in_place("truediv", other) return super().__itruediv__(other) # type: ignore def __ifloordiv__(self, other: Any) -> T: self.events.in_place("floordiv", other) return super().__ifloordiv__(other) # type: ignore def __imod__(self, other: Any) -> T: self.events.in_place("mod", other) return super().__imod__(other) # type: ignore def __ipow__(self, other: Any) -> T: self.events.in_place("pow", other) return super().__ipow__(other) # type: ignore def __ilshift__(self, other: Any) -> T: self.events.in_place("lshift", other) return super().__ilshift__(other) # type: ignore def __irshift__(self, other: Any) -> T: self.events.in_place("rshift", other) return super().__irshift__(other) # type: ignore def __iand__(self, other: Any) -> T: self.events.in_place("and", other) return super().__iand__(other) # type: ignore def __ixor__(self, other: Any) -> T: self.events.in_place("xor", other) return super().__ixor__(other) # type: ignore def __ior__(self, other: Any) -> T: self.events.in_place("or", other) return super().__ior__(other) # type: ignore class EventedCallableObjectProxy(EventedObjectProxy): """Create a proxy of `target` that includes an `events` [psygnal.SignalGroup][]. `target` must be callable. !!! important This class requires `wrapt` to be installed. You can install directly (`pip install wrapt`) or by using the psygnal extra: `pip install psygnal[proxy]` Signals will be emitted whenever an attribute is set or deleted, or (if the object implements `__getitem__`) whenever an item is set or deleted. If the object supports in-place modification (i.e. any of the `__i{}__` magic methods), then an `in_place` event is emitted (with the name of the method) whenever any of them are used. Lastly, if the item is called, a `called` event is emitted with the (args, kwargs) used in the call. 
The events available at `target.events` include: - `attribute_set`: `Signal(str, object)` - `attribute_deleted`: `Signal(str)` - `item_set`: `Signal(object, object)` - `item_deleted`: `Signal(object)` - `in_place`: `Signal(str, object)` - `called`: `Signal(tuple, dict)` Parameters ---------- target : Callable An callable object to wrap """ def __init__(self, target: Callable): super().__init__(target) @property def events(self) -> CallableProxyEvents: # pragma: no cover # unclear why """`SignalGroup` containing events for this object proxy.""" obj_id = id(self) if obj_id not in _OBJ_CACHE: _OBJ_CACHE[obj_id] = CallableProxyEvents() finalize(self, partial(_OBJ_CACHE.pop, obj_id, None)) return _OBJ_CACHE[obj_id] # type: ignore def __call__(self, *args: Any, **kwargs: Any) -> Any: """Call the wrapped object and emit a `called` signal.""" self.events.called(args, kwargs) return self.__wrapped__(*args, **kwargs) psygnal-0.15.0/src/psygnal/containers/_evented_set.py0000644000000000000000000002714715073705675017661 0ustar00from __future__ import annotations import inspect from collections.abc import Iterable, Iterator, Mapping, MutableSet from itertools import chain from typing import ( TYPE_CHECKING, Any, ClassVar, Final, TypeVar, get_args, ) from psygnal import Signal, SignalGroup if TYPE_CHECKING: from pydantic import GetCoreSchemaHandler, SerializationInfo from typing_extensions import Self _T = TypeVar("_T") class BailType: pass BAIL: Final = BailType() class _BaseMutableSet(MutableSet[_T]): _data: set[_T] # pragma: no cover def __init__(self, iterable: Iterable[_T] = ()): self._data = set() self._data.update(iterable) def add(self, item: _T) -> None: """Add an element to a set. This has no effect if the element is already present. """ _item = self._pre_add_hook(item) if not isinstance(_item, BailType): self._do_add(_item) self._post_add_hook(_item) def update(self, *others: Iterable[_T]) -> None: """Update this set with the union of this set and others.""" for i in chain(*others): self.add(i) def discard(self, item: _T) -> None: """Remove an element from a set if it is a member. If the element is not a member, do nothing. """ _item = self._pre_discard_hook(item) if not isinstance(_item, BailType): self._do_discard(_item) self._post_discard_hook(_item) def clear(self) -> None: _item = self._pre_clear_hook() if not isinstance(_item, BailType): self._do_clear() self._post_clear_hook(_item) def __contains__(self, value: object) -> bool: """Return True if value is in set.""" return value in self._data def __iter__(self) -> Iterator[_T]: """Implement iter(self).""" return iter(self._data) def __len__(self) -> int: """Return len(self).""" return len(self._data) def __repr__(self) -> str: """Return repr(self).""" return f"{self.__class__.__name__}({self._data!r})" # -------- def _pre_add_hook(self, item: _T) -> _T | BailType: return item # pragma: no cover def _post_add_hook(self, item: _T) -> None: ... def _pre_discard_hook(self, item: _T) -> _T | BailType: return item # pragma: no cover def _post_discard_hook(self, item: _T) -> None: ... def _pre_clear_hook(self) -> tuple[_T, ...] | BailType: return tuple(self) # pragma: no cover def _post_clear_hook(self, item: tuple[_T, ...]) -> None: ... 
def _do_add(self, item: _T) -> None: self._data.add(item) def _do_discard(self, item: _T) -> None: self._data.discard(item) def _do_clear(self) -> None: self._data.clear() # -------- To match set API def __copy__(self) -> Self: return self.copy() def copy(self) -> Self: return self.__class__(self) def difference(self, *s: Iterable[_T]) -> Self: """Return the difference of two or more sets as a new set. (i.e. all elements that are in this set but not the others.) """ other = set(chain(*s)) return self.__class__(i for i in self if i not in other) def difference_update(self, *s: Iterable[_T]) -> None: """Remove all elements of another set from this set.""" for i in chain(*s): self.discard(i) def intersection(self, *s: Iterable[_T]) -> Self: """Return the intersection of two sets as a new set. (i.e. all elements that are in both sets.) """ other = set.intersection(*(set(x) for x in s)) return self.__class__(i for i in self if i in other) def intersection_update(self, *s: Iterable[_T]) -> None: """Update this set with the intersection of itself and another.""" other = set.intersection(*(set(x) for x in s)) for i in tuple(self): if i not in other: self.discard(i) def issubset(self, __s: Iterable[Any]) -> bool: """Report whether another set contains this set.""" return set(self).issubset(__s) def issuperset(self, __s: Iterable[Any]) -> bool: """Report whether this set contains another set.""" return set(self).issuperset(__s) def symmetric_difference(self, __s: Iterable[_T]) -> Self: """Return the symmetric difference of two sets as a new set. (i.e. all elements that are in exactly one of the sets.) """ a = chain((i for i in __s if i not in self), (i for i in self if i not in __s)) return self.__class__(a) def symmetric_difference_update(self, __s: Iterable[_T]) -> None: """Update this set with the symmetric difference of itself and another. This will remove any items in this set that are also in `other`, and add any items in others that are not present in this set. """ for i in __s: self.discard(i) if i in self else self.add(i) def union(self, *s: Iterable[_T]) -> Self: """Return the union of sets as a new set. (i.e. all elements that are in either set.) 
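        Examples
        --------
        A tiny sketch via the public `EventedSet` subclass (contents are
        arbitrary):

        >>> from psygnal.containers import EventedSet
        >>> EventedSet([1, 2]).union({2, 3})
        EventedSet({1, 2, 3})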
""" new = self.copy() new.update(*s) return new # PYDANTIC SUPPORT @classmethod def __get_pydantic_core_schema__( cls, source_type: Any, handler: GetCoreSchemaHandler ) -> Mapping[str, Any]: """Return the Pydantic core schema for this object.""" from pydantic_core import core_schema def _serialize(obj: _BaseMutableSet[_T], info: SerializationInfo, /) -> Any: if info.mode_is_json(): return obj._data return cls(obj._data) # get item type item_type = args[0] if (args := get_args(source_type)) else Any items_schema = handler.generate_schema(item_type) set_schema = core_schema.set_schema(items_schema=items_schema) return core_schema.no_info_after_validator_function( function=cls, json_schema_input_schema=set_schema, schema=set_schema, serialization=core_schema.plain_serializer_function_ser_schema( _serialize, info_arg=True, ), ) class OrderedSet(_BaseMutableSet[_T]): """A set that preserves insertion order, uses dict behind the scenes.""" _data: dict[_T, None] # type: ignore # pragma: no cover def __init__(self, iterable: Iterable[_T] = ()): self._data = {} self.update(iterable) def _do_add(self, item: _T) -> None: self._data[item] = None def _do_discard(self, item: _T) -> None: self._data.pop(item, None) def __repr__(self) -> str: """Return repr(self).""" inner = ", ".join(str(x) for x in self._data) return f"{self.__class__.__name__}(({inner}))" class SetEvents(SignalGroup): """Events available on [EventedSet][psygnal.containers.EventedSet]. Attributes ---------- items_changed (added: Tuple[Any, ...], removed: Tuple[Any, ...]) A signal that will emitted whenever an item or items are added or removed. Connected callbacks will be called with `callback(added, removed)`, where `added` and `removed` are tuples containing the objects that have been added or removed from the set. """ items_changed = Signal(tuple, tuple, reemission="queued") class EventedSet(_BaseMutableSet[_T]): """A set with an `items_changed` signal that emits when items are added/removed. Parameters ---------- iterable : Iterable[_T] Data to populate the set. If omitted, an empty set is created. Attributes ---------- events : SetEvents SignalGroup that with events related to set mutation. 
(see SetEvents) Examples -------- >>> from psygnal.containers import EventedSet >>> >>> my_set = EventedSet([1, 2, 3]) >>> my_set.events.items_changed.connect( >>> lambda a, r: print(f"added={a}, removed={r}") >>> ) >>> my_set.update({3, 4, 5}) added=(4, 5), removed=() Multi-item events will be reduced into a single emission: >>> my_set.symmetric_difference_update({4, 5, 6, 7}) added=(6, 7), removed=(4, 5) >>> my_set EventedSet({1, 2, 3, 6, 7}) """ events: SetEvents # pragma: no cover _psygnal_group_: ClassVar[str] = "events" def __init__(self, iterable: Iterable[_T] = ()): self.events = self._get_events_class() super().__init__(iterable) def update(self, *others: Iterable[_T]) -> None: """Update this set with the union of this set and others.""" with self.events.items_changed.paused(_reduce_events): super().update(*others) def clear(self) -> None: """Remove all elements from this set.""" with self.events.items_changed.paused(_reduce_events): super().clear() def difference_update(self, *s: Iterable[_T]) -> None: """Remove all elements of another set from this set.""" with self.events.items_changed.paused(_reduce_events): super().difference_update(*s) def intersection_update(self, *s: Iterable[_T]) -> None: """Update this set with the intersection of itself and another.""" with self.events.items_changed.paused(_reduce_events): super().intersection_update(*s) def symmetric_difference_update(self, __s: Iterable[_T]) -> None: """Update this set with the symmetric difference of itself and another. This will remove any items in this set that are also in `other`, and add any items in others that are not present in this set. """ with self.events.items_changed.paused(_reduce_events, ((), ())): super().symmetric_difference_update(__s) def _pre_add_hook(self, item: _T) -> _T | BailType: return BAIL if item in self else item def _post_add_hook(self, item: _T) -> None: self._emit_change((item,), ()) def _pre_discard_hook(self, item: _T) -> _T | BailType: return BAIL if item not in self else item def _post_discard_hook(self, item: _T) -> None: self._emit_change((), (item,)) def _pre_clear_hook(self) -> tuple[_T, ...] | BailType: return BAIL if len(self) == 0 else tuple(self) def _post_clear_hook(self, item: tuple[_T, ...]) -> None: self._emit_change((), item) def _emit_change(self, added: tuple[_T, ...], removed: tuple[_T, ...]) -> None: """Emit a change event.""" self.events.items_changed.emit(added, removed) def _get_events_class(self) -> SetEvents: return SetEvents() class EventedOrderedSet(EventedSet, OrderedSet[_T]): """A ordered variant of EventedSet that maintains insertion order. Parameters ---------- iterable : Iterable[_T] Data to populate the set. If omitted, an empty set is created. Attributes ---------- events : SetEvents SignalGroup that with events related to set mutation. 
(see SetEvents) """ # reproducing init here to avoid a mkdocs warning: # "Parameter 'iterable' does not appear in the function signature" def __init__(self, iterable: Iterable[_T] = ()): super().__init__(iterable) def _reduce_events(li: Iterable[tuple[Iterable, Iterable]]) -> tuple[tuple, tuple]: """Combine multiple events into a single event.""" added_li: list = [] removed_li: list = [] for added, removed in li: added_li.extend(added) removed_li.extend(removed) return tuple(added_li), tuple(removed_li) # for performance reasons _reduce_events.__signature__ = inspect.signature(_reduce_events) # type: ignore [attr-defined] psygnal-0.15.0/src/psygnal/containers/_selectable_evented_list.py0000644000000000000000000001020015073705675022202 0ustar00"""MutableSequence with a selection model.""" from collections.abc import Iterable from typing import Any, TypeVar from ._evented_list import EventedList, ListEvents from ._selection import Selectable _T = TypeVar("_T") class SelectableEventedList(Selectable[_T], EventedList[_T]): """`EventedList` subclass with a built in selection model. In addition to all `EventedList` properties, this class also has a `selection` attribute that manages a set of selected items in the list. Parameters ---------- data : iterable, optional Elements to initialize the list with. hashable : bool Whether the list should be hashable as id(self). By default `True`. child_events: bool Whether to re-emit events from emitted from evented items in the list (i.e. items that have SignalInstances). If `True`, child events can be connected at `EventedList.events.child_event`. By default, `False`. Attributes ---------- events : ListEvents SignalGroup that with events related to list mutation. (see ListEvents) selection : Selection An evented set containing the currently selected items, along with an `active` and `current` item. (See `Selection`) """ events: ListEvents # pragma: no cover def __init__( self, data: Iterable[_T] = (), *, hashable: bool = True, child_events: bool = False, ): self._activate_on_insert: bool = True super().__init__(data=data, hashable=hashable, child_events=child_events) self.events.removed.connect(self._on_item_removed) def _on_item_removed(self, idx: int, obj: Any) -> None: self.selection.discard(obj) def insert(self, index: int, value: _T) -> None: """Insert item(s) into the list and update the selection.""" super().insert(index, value) if self._activate_on_insert: self.selection.active = value def select_all(self) -> None: """Select all items in the list.""" self.selection.update(self) def deselect_all(self) -> None: """Deselect all items in the list.""" self.selection.clear() def select_next( self, step: int = 1, expand_selection: bool = False, wraparound: bool = False ) -> None: """Select the next item in the list. 
        Parameters
        ----------
        step : int
            The step size to take when picking the next item, by default 1
        expand_selection : bool
            If True, will expand the selection to contain both the current
            item and the next item, by default False
        wraparound : bool
            Whether to return to the beginning of the list if the end has been
            reached, by default False
        """
        if len(self) == 0:
            return
        elif not self.selection:
            idx = -1 if step > 0 else 0
        else:
            idx = self.index(self.selection._current) + step
        idx_in_sequence = len(self) > idx >= 0
        if wraparound:
            idx = idx % len(self)
        elif not idx_in_sequence:
            idx = -1 if step > 0 else 0
        next_item = self[idx]
        if expand_selection:
            self.selection.add(next_item)
            self.selection._current = next_item
        else:
            self.selection.active = next_item

    def select_previous(
        self, expand_selection: bool = False, wraparound: bool = False
    ) -> None:
        """Select the previous item in the list."""
        self.select_next(
            step=-1, expand_selection=expand_selection, wraparound=wraparound
        )

    def remove_selected(self) -> tuple[_T, ...]:
        """Remove selected items from the list and the selection.

        Returns
        -------
        Tuple[_T, ...]
            The items that were removed.
        """
        selected_items = tuple(self.selection)
        idx = 0
        for item in list(self.selection):
            idx = self.index(item)
            self.remove(item)
        new_idx = max(0, idx - 1)
        if len(self) > new_idx:
            self.selection.add(self[new_idx])
        return selected_items
psygnal-0.15.0/src/psygnal/containers/_selection.py0000644000000000000000000001570215073705675017333 0ustar00from __future__ import annotations

from collections.abc import Container
from typing import TYPE_CHECKING, Any, TypeVar

from psygnal._signal import Signal

from ._evented_set import BailType, EventedOrderedSet, SetEvents, _reduce_events

if TYPE_CHECKING:
    from collections.abc import Iterable

_T = TypeVar("_T")
_S = TypeVar("_S")


class SelectionEvents(SetEvents):
    """Events available on [Selection][psygnal.containers.Selection].

    Attributes
    ----------
    items_changed (added: Tuple[_T], removed: Tuple[_T])
        A signal that will be emitted whenever an item or items are added or
        removed. Connected callbacks will be called with
        `callback(added, removed)`, where `added` and `removed` are tuples
        containing the objects that have been added or removed from the set.
    active (value: _T)
        Emitted when the active item has changed. An active item is a single
        selected item.
    _current (value: _T)
        Emitted when the current item has changed. (Private event)
    """

    active = Signal(object)
    _current = Signal(object)


class Selection(EventedOrderedSet[_T]):
    """A model of selected items, with an `active` and a `current` item.

    There can only be one `active` and one `current` item, but there can be
    multiple selected items. An "active" item is defined as a single selected
    item (if multiple items are selected, there is no active item). The
    "current" item is mostly useful for (e.g.) keyboard actions: even with
    multiple items selected, you may only have one current item, and keyboard
    events (like up and down) can modify that current item. It's possible to
    have a current item without an active item, but an active item will always
    be the current item.

    An item can be the current item and selected at the same time. Qt views
    will ensure that there is always a current item as keyboard navigation,
    for example, requires a current item.

    This pattern mimics current/selected items from Qt:
    https://doc.qt.io/qt-5/model-view-programming.html#current-item-and-selected-items

    Parameters
    ----------
    data : iterable, optional
        Elements to initialize the set with.
    parent : Container, optional
        The parent container, if any. This is used to provide validation upon
        mutation in common use cases.

    Attributes
    ----------
    events : SelectionEvents
        SignalGroup with events related to selection changes.
        (see SelectionEvents)
    active : Any, optional
        The active item, if any. An "active" item is defined as a single
        selected item (if multiple items are selected, there is no active
        item).
    _current : Any, optional
        The current item, if any. This is used primarily by GUI views when
        handling mouse/key events.
    """

    events: SelectionEvents  # pragma: no cover

    def __init__(self, data: Iterable[_T] = (), parent: Container | None = None):
        self._active: _T | None = None
        self._current_: _T | None = None
        self._parent: Container | None = parent
        super().__init__(iterable=data)
        self._update_active()

    @property
    def _current(self) -> _T | None:  # pragma: no cover
        """Get current item."""
        return self._current_

    @_current.setter
    def _current(self, value: _T | None) -> None:  # pragma: no cover
        """Set current item."""
        if value == self._current_:
            return
        self._current_ = value
        self.events._current.emit(value)

    @property
    def active(self) -> _T | None:  # pragma: no cover
        """Return the currently active item or None."""
        return self._active

    @active.setter
    def active(self, value: _T | None) -> None:  # pragma: no cover
        """Set the active item.

        This makes `value` the only selected item, and makes it current.
        """
        if value == self._active:
            return
        self._active = value
        self.clear() if value is None else self.select_only(value)
        self._current = value
        self.events.active.emit(value)

    def clear(self, keep_current: bool = False) -> None:
        """Clear the selection.

        Parameters
        ----------
        keep_current : bool
            If `False` (the default), the "current" item will also be set to
            None.
        """
        if not keep_current:
            self._current = None
        super().clear()

    def toggle(self, obj: _T) -> None:
        """Toggle selection state of obj."""
        self.symmetric_difference_update({obj})

    def select_only(self, obj: _T) -> None:
        """Unselect everything but `obj`. Add to selection if not currently selected."""
        with self.events.items_changed.paused(_reduce_events):
            self.intersection_update({obj})
            self.add(obj)

    def replace_selection(self, new_selection: Iterable[_T]) -> None:
        """Replace the current selection with `new_selection`.

        This is equivalent to calling `intersection_update` followed by
        `update`, but is more efficient because it only emits a single
        `items_changed` event.
        """
        with self.events.items_changed.paused(_reduce_events):
            self.intersection_update(new_selection)
            self.update(new_selection)

    def _update_active(self) -> None:
        """On a selection event, update the active item based on selection.

        An active item is a single selected item.
        """
        if len(self) == 1:
            self.active = next(iter(self))
        elif self._active is not None:
            self._active = None
            self.events.active.emit(None)

    def _get_events_class(self) -> SelectionEvents:
        """Override SetEvents with SelectionEvents."""
        return SelectionEvents()

    def _emit_change(self, added: tuple[_T, ...], removed: tuple[_T, ...]) -> None:
        """Emit a change event."""
        super()._emit_change(added, removed)
        self._update_active()

    def _pre_add_hook(self, item: _T) -> _T | BailType:
        if self._parent is not None and item not in self._parent:
            raise ValueError(
                "Cannot select an item that is not in the parent container."
) return super()._pre_add_hook(item) def __hash__(self) -> int: """Make selection hashable.""" return id(self) class Selectable(Container[_S]): """Mixin that adds a selection model to a container.""" def __init__(self, *args: Any, **kwargs: Any) -> None: self._selection: Selection[_S] = Selection(parent=self) super().__init__(*args, **kwargs) @property def selection(self) -> Selection[_S]: # pragma: no cover """Get current selection.""" return self._selection @selection.setter def selection(self, new_selection: Iterable[_S]) -> None: # pragma: no cover """Set selection, without deleting selection model object.""" self._selection.intersection_update(new_selection) self._selection.update(new_selection) psygnal-0.15.0/tests/__init__.py0000644000000000000000000000021415073705675013467 0ustar00# don't love having this file here, # but it's the only way I've found to target the tests directory # in the mypy config in pyproject.toml psygnal-0.15.0/tests/test_bench.py0000644000000000000000000001450415073705675014055 0ustar00import sys from collections.abc import Callable from dataclasses import dataclass from functools import partial from inspect import signature from typing import ClassVar from unittest.mock import Mock import pytest from psygnal import EmissionInfo, Signal, SignalGroupDescriptor, SignalInstance, evented from psygnal._group import PathStep if all(x not in {"--codspeed", "--benchmark", "tests/test_bench.py"} for x in sys.argv): pytest.skip("use --benchmark to run benchmark", allow_module_level=True) CALLBACK_TYPES = [ "function", "method", "lambda", "partial", "partial_method", "setattr", "setitem", "real_func", "print", ] # fmt: off class Emitter: one_int = Signal(int) int_str = Signal(int, str) class Obj: x: int = 0 def __setitem__(self, key: str, value: int) -> None: self.x = value def no_args(self) -> None: ... def one_int(self, x: int) -> None: ... def int_str(self, x: int, y: str) -> None: ... def no_args() -> None: ... def one_int(x: int) -> None: ... def int_str(x: int, y: str) -> None: ... 
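# The stubs above (`no_args`, `one_int`, `int_str`), together with `Emitter` and
# `Obj`, provide the callables exercised by the CALLBACK_TYPES benchmark matrix.
# `_get_callback` below maps most CALLBACK_TYPES entries to one of these
# callables; the "setattr" and "setitem" entries are instead handled in the
# tests via `connect_setattr` / `connect_setitem`.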
def real_func() -> None: list(range(4)) # simulate a brief thing INT_SIG = signature(one_int) # fmt: on def _get_callback(callback_type: str, obj: Obj) -> Callable: callback_types: dict[str, Callable] = { "function": one_int, "method": obj.one_int, "lambda": lambda x: None, "partial": partial(int_str, y="foo"), "partial_method": partial(obj.int_str, y="foo"), "real_func": real_func, "print": print, } return callback_types[callback_type] # Creation suite ------------------------------------------ def test_create_signal(benchmark: Callable) -> None: benchmark(Signal, int) def test_create_signal_instance(benchmark: Callable) -> None: benchmark(SignalInstance, INT_SIG) # Connect suite --------------------------------------------- @pytest.mark.parametrize("check_types", ["check_types", ""]) @pytest.mark.parametrize("callback_type", CALLBACK_TYPES) def test_connect_time( benchmark: Callable, callback_type: str, check_types: str ) -> None: emitter = Emitter() obj = Obj() kwargs = {} if callback_type == "setattr": func: Callable = emitter.one_int.connect_setattr args: tuple = (obj, "x") kwargs = {"maxargs": 1} elif callback_type == "setitem": func = emitter.one_int.connect_setitem args = (obj, "x") kwargs = {"maxargs": 1} else: func = emitter.one_int.connect args = (_get_callback(callback_type, obj),) kwargs = {"check_types": bool(check_types)} benchmark(func, *args, **kwargs) # Emit suite ------------------------------------------------ @pytest.mark.parametrize("n_connections", range(2, 2**6, 16)) @pytest.mark.parametrize("callback_type", CALLBACK_TYPES) def test_emit_time(benchmark: Callable, n_connections: int, callback_type: str) -> None: emitter = Emitter() obj = Obj() if callback_type == "setattr": for _ in range(n_connections): emitter.one_int.connect_setattr(obj, "x", maxargs=1) elif callback_type == "setitem": for _ in range(n_connections): emitter.one_int.connect_setitem(obj, "x", maxargs=1) else: callback = _get_callback(callback_type, obj) for _ in range(n_connections): emitter.one_int.connect(callback, unique=False) benchmark(emitter.one_int.emit, 1) def test_emit_fast(benchmark: Callable) -> None: emitter = Emitter() emitter.one_int.connect(one_int) benchmark(emitter.one_int.emit_fast, 1) @pytest.mark.benchmark def test_evented_creation() -> None: @evented @dataclass class Obj: x: int = 0 y: str = "hi" z: bool = False _ = Obj().events # type: ignore def test_evented_setattr(benchmark: Callable) -> None: @evented @dataclass class Obj: x: int = 0 y: str = "hi" z: bool = False obj = Obj() _ = obj.events # type: ignore benchmark(setattr, obj, "x", 1) def _get_dataclass(type_: str) -> type: if type_ == "attrs": from attrs import define @define class Foo: a: int b: str c: bool d: float e: tuple[int, str] events: ClassVar = SignalGroupDescriptor() elif type_ == "dataclass": @dataclass class Foo: # type: ignore [no-redef] a: int b: str c: bool d: float e: tuple[int, str] events: ClassVar = SignalGroupDescriptor() elif type_ == "msgspec": import msgspec class Foo(msgspec.Struct): # type: ignore [no-redef] a: int b: str c: bool d: float e: tuple[int, str] events: ClassVar = SignalGroupDescriptor() elif type_ == "pydantic": from pydantic import BaseModel class Foo(BaseModel): # type: ignore [no-redef] a: int b: str c: bool d: float e: tuple[int, str] events: ClassVar = SignalGroupDescriptor() return Foo @pytest.mark.parametrize("type_", ["dataclass", "pydantic", "attrs", "msgspec"]) def test_dataclass_group_create(type_: str, benchmark: Callable) -> None: if type_ == "msgspec": 
pytest.importorskip("msgspec") Foo = _get_dataclass(type_) foo = Foo(a=1, b="hi", c=True, d=1.0, e=(1, "hi")) benchmark(getattr, foo, "events") @pytest.mark.parametrize("type_", ["dataclass", "pydantic", "attrs", "msgspec"]) def test_dataclass_setattr(type_: str, benchmark: Callable) -> None: if type_ == "msgspec": pytest.importorskip("msgspec") Foo = _get_dataclass(type_) foo = Foo(a=1, b="hi", c=True, d=1.0, e=(1, "hi")) mock = Mock() foo.events._psygnal_relay.connect(mock) def _doit() -> None: foo.a = 2 foo.b = "hello" foo.c = False foo.d = 2.0 foo.e = (2, "hello") benchmark(_doit) for emitted, attr in zip( [(2, 1), ("hello", "hi"), (False, True), (2.0, 1.0), ((2, "hello"), (1, "hi"))], "abcde", strict=False, ): mock.assert_any_call( EmissionInfo(getattr(foo.events, attr), emitted, (PathStep(attr=attr),)) ) assert getattr(foo, attr) == emitted[0] psygnal-0.15.0/tests/test_coroutines.py0000644000000000000000000003462115073705675015172 0ustar00from __future__ import annotations import asyncio import gc import importlib.util import signal from typing import TYPE_CHECKING, Any, Literal, Protocol from unittest.mock import Mock import pytest import pytest_asyncio from psygnal import _async from psygnal._weak_callback import WeakCallback, weak_callback if TYPE_CHECKING: from collections.abc import Callable, Iterator # Available backends for parametrization AVAILABLE_BACKENDS = ["asyncio"] if importlib.util.find_spec("trio") is not None: AVAILABLE_BACKENDS.append("trio") if importlib.util.find_spec("anyio") is not None: AVAILABLE_BACKENDS.append("anyio") class BackendTestRunner(Protocol): """Protocol for backend-specific test runners.""" @property def backend_name(self) -> Literal["asyncio", "anyio", "trio"]: """Name of the backend being used.""" ... async def sleep(self, duration: float) -> None: """Sleep for the given duration using backend-specific sleep.""" ... def run_with_backend(self, test_func: Callable[[], Any]) -> Any: """Run a test function with proper backend setup and teardown. Synchronous.""" ... 
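# The concrete runners below implement BackendTestRunner for each supported
# backend. They share one lifecycle: clear any previously configured backend,
# call `_async.set_async_backend(...)`, wait for `backend.running` to be set
# (launching `backend.run()` in a task group / nursery where needed), await the
# async test body, then cancel the backend task and clear the backend again.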
class AsyncioTestRunner: """Test runner for asyncio backend.""" @property def backend_name(self) -> Literal["asyncio"]: return "asyncio" async def sleep(self, duration: float) -> None: await asyncio.sleep(duration) def run_with_backend(self, test_func: Callable[[], Any]) -> Any: """Run test with asyncio backend.""" async def _run_test() -> Any: _async.clear_async_backend() backend = _async.set_async_backend("asyncio") # Wait for backend to be running await self._wait_for_backend_running(backend) try: return await test_func() finally: # Cleanup if hasattr(backend, "_task") and not backend._task.done(): backend._task.cancel() try: await backend._task except asyncio.CancelledError: pass _async.clear_async_backend() return asyncio.run(_run_test()) async def _wait_for_backend_running( self, backend: _async._AsyncBackend, timeout: float = 1.0 ) -> None: """Wait for backend to be running with a timeout.""" start_time = asyncio.get_event_loop().time() while not backend.running.is_set(): if asyncio.get_event_loop().time() - start_time > timeout: raise TimeoutError("Backend did not start running within timeout") await asyncio.sleep(0) class AnyioTestRunner: """Test runner for anyio backend.""" @property def backend_name(self) -> Literal["anyio"]: return "anyio" async def sleep(self, duration: float) -> None: import anyio await anyio.sleep(duration) def run_with_backend(self, test_func: Callable[[], Any]) -> Any: """Run test with anyio backend using structured concurrency.""" import anyio async def _run_test(): _async.clear_async_backend() backend = _async.set_async_backend("anyio") result = None async with anyio.create_task_group() as tg: tg.start_soon(backend.run) # Wait for backend to be running await backend.running.wait() try: result = await test_func() finally: # Cancel task group to shutdown properly tg.cancel_scope.cancel() _async.clear_async_backend() return result return anyio.run(_run_test) class TrioTestRunner: """Test runner for trio backend.""" @property def backend_name(self) -> Literal["trio"]: return "trio" async def sleep(self, duration: float) -> None: import trio await trio.sleep(duration) def run_with_backend(self, test_func: Callable[[], Any]) -> Any: """Run test with trio backend using structured concurrency.""" # On Windows asyncio has probably left its FD installed try: signal.set_wakeup_fd(-1) # restore default except (ValueError, AttributeError): pass # not the main thread or not supported import trio result = None async def _trio_main(): nonlocal result _async.clear_async_backend() backend = _async.set_async_backend("trio") # Use a timeout to prevent hanging with trio.move_on_after(5.0) as cancel_scope: async with trio.open_nursery() as nursery: nursery.start_soon(backend.run) # Wait for backend to be running await backend.running.wait() try: result = await test_func() finally: # Cancel nursery to shutdown properly nursery.cancel_scope.cancel() # Check if we timed out if cancel_scope.cancelled_caught: raise TimeoutError("Test timed out") _async.clear_async_backend() # Run in trio context trio.run(_trio_main) return result async def mock_call_count( mock: Mock, runner: BackendTestRunner, max_iterations: int = 100 ) -> None: """Wait for callback execution with backend-specific sleep.""" for _ in range(max_iterations): await runner.sleep(0.01) if mock.call_count > 0: break @pytest_asyncio.fixture async def clean_async_backend(): """Fixture to ensure clean async backend state.""" _async.clear_async_backend() yield _async.clear_async_backend() 
@pytest.fixture(params=AVAILABLE_BACKENDS) def runner( request: pytest.FixtureRequest, clean_async_backend: None ) -> Iterator[BackendTestRunner]: """Get the backend runner for the specified backend.""" mapping: dict[str, type[BackendTestRunner]] = { "asyncio": AsyncioTestRunner, "anyio": AnyioTestRunner, "trio": TrioTestRunner, } yield mapping[request.param]() # Parametrized tests for all backends @pytest.mark.parametrize( "slot_type", [ "coroutinefunc", "weak_coroutinefunc", "coroutinemethod", ], ) def test_slot_types_all_backends(runner: BackendTestRunner, slot_type: str) -> None: """Test async slot types with all available backends.""" async def _test_slot_type(): mock = Mock() final_mock = Mock() if slot_type in {"coroutinefunc", "weak_coroutinefunc"}: async def test_obj(x: int) -> int: mock(x) return x cb = weak_callback( test_obj, strong_func=(slot_type == "coroutinefunc"), finalize=final_mock, ) elif slot_type == "coroutinemethod": class MyObj: async def coroutine_method(self, x: int) -> int: mock(x) return x obj = MyObj() cb = weak_callback(obj.coroutine_method, finalize=final_mock) assert isinstance(cb, WeakCallback) assert isinstance(cb.slot_repr(), str) assert cb.dereference() is not None # Test callback execution cb.cb((2,)) await mock_call_count(mock, runner) mock.assert_called_once_with(2) # Test direct await mock.reset_mock() result = await cb(4) assert result == 4 mock.assert_called_once_with(4) # Test weak reference cleanup if slot_type in {"coroutinefunc", "weak_coroutinefunc"}: del test_obj else: del obj gc.collect() if slot_type == "coroutinefunc": # strong_func cb.cb((4,)) await mock_call_count(mock, runner) mock.assert_called_with(4) else: await mock_call_count(final_mock, runner) final_mock.assert_called_once_with(cb) assert cb.dereference() is None with pytest.raises(ReferenceError): cb.cb((2,)) with pytest.raises(ReferenceError): await cb(2) # Run the test with the appropriate backend runner.run_with_backend(_test_slot_type) def test_backend_error_conditions(runner: BackendTestRunner) -> None: """Test backend error conditions and exception handling.""" async def _test_errors(): mock = Mock() async def test_coro(x: int) -> int: if x == 999: raise ValueError("Test error") mock(x) return x cb = weak_callback(test_coro, strong_func=True) # Test normal execution cb.cb((5,)) await mock_call_count(mock, runner) mock.assert_called_once_with(5) # Test error case - should not crash the backend cb.cb((999,)) await runner.sleep(0.1) # Give time for error to be handled # Backend should still work after error mock.reset_mock() cb.cb((10,)) await mock_call_count(mock, runner) mock.assert_called_once_with(10) # Run the test with the backend runner runner.run_with_backend(_test_errors) @pytest.mark.usefixtures("clean_async_backend") @pytest.mark.asyncio async def test_run_method_early_return() -> None: """Test that run() method returns early if backend is already running.""" backend = _async.set_async_backend("asyncio") # Wait for backend to be running start_time = asyncio.get_event_loop().time() while not backend.running.is_set(): if asyncio.get_event_loop().time() - start_time > 1.0: raise TimeoutError("Backend did not start running within timeout") await asyncio.sleep(0) # Now calling run() again should return early await backend.run() # Backend should still be running assert backend.running.is_set() @pytest.mark.parametrize("backend_name", AVAILABLE_BACKENDS) def test_high_level_api(backend_name: Literal["trio", "asyncio", "anyio"]) -> None: """Test the exact usage pattern 
shown in the feature summary documentation.""" def run_example() -> None: """The example from the feature summary, adapted for testing.""" async def example_main() -> None: # Step 1: Set Backend Early (Once Per Application) backend = _async.set_async_backend(backend_name) # Step 2: Launch Backend in Your Event Loop (backend-specific) if backend_name == "asyncio": import asyncio # Start the backend as a background task async_backend = _async.get_async_backend() assert async_backend is not None task = asyncio.create_task(async_backend.run()) # Wait for backend to be running await backend.running.wait() elif backend_name == "anyio": import anyio async with anyio.create_task_group() as tg: # Start the backend in the task group async_backend = _async.get_async_backend() assert async_backend is not None tg.start_soon(async_backend.run) # Wait for backend to be running await backend.running.wait() # Run the actual example await run_signal_example() # Cancel to exit cleanly tg.cancel_scope.cancel() return elif backend_name == "trio": import trio async with trio.open_nursery() as nursery: # Start the backend in the nursery async_backend = _async.get_async_backend() assert async_backend is not None nursery.start_soon(async_backend.run) # Wait for backend to be running await backend.running.wait() # Run the actual example await run_signal_example() # Cancel to exit cleanly nursery.cancel_scope.cancel() return # For asyncio, run the example after backend is started try: await run_signal_example() finally: if backend_name == "asyncio": task.cancel() try: await task except asyncio.CancelledError: pass async def run_signal_example() -> None: """Step 3: Connect Async Callbacks - the exact example from docs.""" from psygnal import Signal class MyObj: value_changed = Signal(str) def set_value(self, value: str) -> None: self.value_changed.emit(value) # Track calls for testing mock = Mock() async def on_value_changed(new_value: str) -> None: mock(new_value) obj = MyObj() obj.value_changed.connect(on_value_changed) obj.set_value("hello!") # Wait for callback to execute max_wait = 100 for _ in range(max_wait): if mock.call_count > 0: break if backend_name == "asyncio": await asyncio.sleep(0.01) elif backend_name == "anyio": import anyio await anyio.sleep(0.01) elif backend_name == "trio": import trio await trio.sleep(0.01) # Verify the callback was called with the expected value assert mock.call_count == 1 assert mock.call_args[0][0] == "hello!" 
# Run the example with the appropriate backend if backend_name == "asyncio": return asyncio.run(example_main()) elif backend_name == "anyio": import anyio return anyio.run(example_main) elif backend_name == "trio": import trio return trio.run(example_main) # Clear any existing backend before test _async.clear_async_backend() try: run_example() finally: # Clean up after test _async.clear_async_backend() psygnal-0.15.0/tests/test_dataclass_utils.py0000644000000000000000000000374215073705675016157 0ustar00from dataclasses import dataclass import pytest from attr import define from psygnal import _dataclass_utils try: from msgspec import Struct except (ImportError, TypeError): # type error on python 3.12-dev Struct = None # type: ignore [assignment,misc] try: from pydantic import __version__ as pydantic_version PYDANTIC2 = pydantic_version.startswith("2.") except ImportError: PYDANTIC2 = False VARIANTS = ["dataclass", "attrs_class", "pydantic_model"] if Struct is not None: VARIANTS.append("msgspec_struct") @pytest.mark.parametrize("frozen", [True, False], ids=["frozen", ""]) @pytest.mark.parametrize("type_", VARIANTS) def test_dataclass_utils(type_: str, frozen: bool) -> None: if type_ == "attrs_class": @define(frozen=frozen) # type: ignore class Foo: x: int y: str = "foo" elif type_ == "dataclass": @dataclass(frozen=frozen) # type: ignore class Foo: # type: ignore [no-redef] x: int y: str = "foo" elif type_ == "msgspec_struct": class Foo(Struct, frozen=frozen): # type: ignore [no-redef] x: int y: str = "foo" elif type_ == "pydantic_model": pytest.importorskip("pydantic") from pydantic import BaseModel class Foo(BaseModel): # type: ignore [no-redef] x: int y: str = "foo" if PYDANTIC2: model_config = {"frozen": frozen} else: class Config: allow_mutation = not frozen for name in VARIANTS: is_type = getattr(_dataclass_utils, f"is_{name}") assert is_type(Foo) is (name == type_) assert is_type(Foo(x=1)) is (name == type_) assert list(_dataclass_utils.iter_fields(Foo)) == [("x", int), ("y", str)] if type_ == "msgspec_struct" and frozen: # not supported until next release of msgspec return assert _dataclass_utils.is_frozen(Foo) == frozen psygnal-0.15.0/tests/test_evented_decorator.py0000644000000000000000000003135115073705675016471 0ustar00import operator from dataclasses import dataclass, field from typing import TYPE_CHECKING, ClassVar, cast, no_type_check from unittest.mock import Mock import numpy as np import pytest from psygnal import ( EmissionInfo, PathStep, Signal, SignalGroup, SignalGroupDescriptor, SignalInstance, evented, get_evented_namespace, is_evented, testing, ) from psygnal._group import SignalRelay try: import pydantic.version PYDANTIC_V2 = pydantic.version.VERSION.startswith("2") except ImportError: PYDANTIC_V2 = False decorated_or_descriptor = pytest.mark.parametrize( "decorator", [True, False], ids=["decorator", "descriptor"] ) @no_type_check def _check_events(cls, events_ns="events"): obj = cls(bar=1, baz="2", qux=np.zeros(3)) assert is_evented(obj) assert is_evented(cls) assert get_evented_namespace(cls) == events_ns assert isinstance(getattr(cls, events_ns), SignalGroupDescriptor) events = getattr(obj, events_ns) assert isinstance(events, SignalGroup) assert set(events) == {"bar", "baz", "qux"} mock = Mock() events.bar.connect(mock) assert obj.bar == 1 obj.bar = 2 assert obj.bar == 2 mock.assert_called_once_with(2, 1) mock.reset_mock() obj.baz = "3" mock.assert_not_called() mock.reset_mock() events.qux.connect(mock) obj.qux = np.ones(3) mock.assert_called_once() assert 
np.array_equal(obj.qux, np.ones(3)) @decorated_or_descriptor @pytest.mark.parametrize("kwargs", ({"slots": True}, {"slots": False})) def test_native_dataclass(decorator: bool, kwargs: dict) -> None: @dataclass(**kwargs) class Base: bar: int baz: str qux: np.ndarray if decorator: @evented(equality_operators={"qux": operator.eq}) # just for test coverage class Foo(Base): ... else: class Foo(Base): # type: ignore [no-redef] events: ClassVar[SignalGroupDescriptor] = SignalGroupDescriptor( equality_operators={"qux": operator.eq} ) _check_events(Foo) @decorated_or_descriptor @pytest.mark.parametrize("slots", [True, False]) def test_attrs_dataclass(decorator: bool, slots: bool) -> None: from attrs import define @define(slots=slots) # type: ignore [misc] class Base: bar: int baz: str qux: np.ndarray if decorator: @evented class Foo(Base): ... else: class Foo(Base): # type: ignore [no-redef] events: ClassVar[SignalGroupDescriptor] = SignalGroupDescriptor() _check_events(Foo) if PYDANTIC_V2: Config = {"arbitrary_types_allowed": True} else: class Config: arbitrary_types_allowed = True @decorated_or_descriptor def test_pydantic_dataclass(decorator: bool) -> None: pytest.importorskip("pydantic") from pydantic.dataclasses import dataclass @dataclass(config=Config) class Base: bar: int baz: str qux: np.ndarray if decorator: @evented class Foo(Base): ... else: class Foo(Base): # type: ignore [no-redef] events: ClassVar[SignalGroupDescriptor] = SignalGroupDescriptor() _check_events(Foo) @decorated_or_descriptor def test_pydantic_base_model(decorator: bool) -> None: pytest.importorskip("pydantic") from pydantic import BaseModel class Base(BaseModel): bar: int baz: str qux: np.ndarray if PYDANTIC_V2: model_config = Config else: Config = Config # type: ignore if decorator: @evented(events_namespace="my_events") class Foo(Base): ... else: class Foo(Base): # type: ignore [no-redef] my_events: ClassVar[SignalGroupDescriptor] = SignalGroupDescriptor() _check_events(Foo, "my_events") @pytest.mark.parametrize("decorator", [True, False], ids=["decorator", "descriptor"]) def test_msgspec_struct(decorator: bool) -> None: if TYPE_CHECKING: import msgspec else: msgspec = pytest.importorskip("msgspec") # remove when py37 is dropped if decorator: @evented class Foo(msgspec.Struct): bar: int baz: str qux: np.ndarray else: class Foo(msgspec.Struct): # type: ignore [no-redef] bar: int baz: str qux: np.ndarray events: ClassVar[SignalGroupDescriptor] = SignalGroupDescriptor() _check_events(Foo) def test_no_signals_warn() -> None: with pytest.warns(UserWarning, match="No mutable fields found on class"): @evented class Foo: ... 
_ = Foo().events # type: ignore with pytest.warns(UserWarning, match="No mutable fields found on class"): class Foo2: events = SignalGroupDescriptor() _ = Foo2().events @dataclass class Foo3: events = SignalGroupDescriptor(warn_on_no_fields=False) # no warning _ = Foo3().events @dataclass class FooPicklable: bar: int events: ClassVar[SignalGroupDescriptor] = SignalGroupDescriptor( cache_on_instance=False ) def test_pickle() -> None: """Make sure that evented classes are still picklable.""" import pickle obj = FooPicklable(1) obj2 = pickle.loads(pickle.dumps(obj)) assert obj2.bar == 1 def test_get_namespace() -> None: @evented(events_namespace="my_events") @dataclass class Foo: x: int assert get_evented_namespace(Foo) == "my_events" assert is_evented(Foo) def test_name_conflicts() -> None: # https://github.com/pyapp-kit/psygnal/pull/269 from dataclasses import field @evented @dataclass class Foo: name: str all: bool = False is_uniform: bool = True signals: list = field(default_factory=list) obj = Foo("foo") assert obj.name == "foo" with pytest.warns( UserWarning, match=r"Names \['all', 'is_uniform', 'signals'\] are reserved" ): group = obj.events assert isinstance(group, SignalGroup) assert "name" in group assert isinstance(group.name, SignalInstance) assert group["name"] is group.name assert "is_uniform" in group and isinstance(group["is_uniform"], SignalInstance) assert "signals" in group and isinstance(group["signals"], SignalInstance) # group.all is always a relay assert isinstance(group.all, SignalRelay) # getitem returns the signal assert "all" in group and isinstance(group["all"], SignalInstance) assert not isinstance(group["all"], SignalRelay) with pytest.raises(AttributeError): # it's not writeable group.all = SignalRelay({}) # type: ignore assert group.psygnals_uniform() is False @evented @dataclass class Foo2: psygnals_uniform: bool = True obj2 = Foo2() with pytest.warns(match=r"Name \['psygnals_uniform'\] is reserved"): _ = obj2.events @dataclass class Foo3: field: int = 1 _psygnal_signals: str = "signals" with pytest.raises( TypeError, match="Fields on an evented class cannot start with '_psygnal'" ): _ = evented(Foo3) def test_nesting() -> None: from dataclasses import dataclass, field @evented @dataclass class Foo: x: int = 1 @evented @dataclass class Bar: y: int = 2 foo: Foo = field(default_factory=Foo) # @evented(connect_child_events=True) # could also use this syntax @dataclass class Baz: events: ClassVar[SignalGroupDescriptor] = SignalGroupDescriptor( connect_child_events=True ) z: int = 3 bar: Bar = field(default_factory=Bar) baz = Baz() mock = Mock() events: SignalGroup = baz.events events.all.connect(mock) baz.bar.foo.x = 3 # trigger nested event # what we expect expected = EmissionInfo( baz.bar.foo.events.x, (3, 1), path=( PathStep(attr="bar"), # Baz → bar PathStep(attr="foo"), # bar → foo PathStep(attr="x"), # foo → x (added by SignalRelay inside Foo) ), ) mock.assert_called_with(expected) def test_signal_relay_partial(): """Test hash and eq methods on _relay_partial objects""" class T(SignalGroup): sig = Signal(int) t = T() a = set() a.add(t.all._relay_partial(PathStep(attr="some_name"))) a.add(t.all._relay_partial(PathStep(attr="some_name"))) assert len(a) == 1 assert t.all._relay_partial(PathStep(attr="some_name")) in a def test_evented_object_replacement_disconnects_old_connections(): """Test that replacing evented objects properly disconnects the old one.""" @evented @dataclass class A: x: int = 1 @dataclass class M: events: ClassVar[SignalGroupDescriptor] = 
SignalGroupDescriptor( connect_child_events=True ) d: A = field(default_factory=A) m = M() # Connect to the main events main_mock = Mock() m.events.connect(main_mock) # Get references to the original and new evented objects original_d = m.d new_d = A(x=99) # Connect directly to the original object's events to verify disconnection original_mock = Mock() original_d.events.connect(original_mock) # Replace the evented object m.d = new_d # Verify the replacement was detected by the main events assert main_mock.call_count == 1 replacement_info = cast("EmissionInfo", main_mock.call_args[0][0]) assert replacement_info.args == (new_d, original_d) assert replacement_info.path == (PathStep(attr="d"),) # Now modify the NEW object - should trigger events through the parent main_mock.reset_mock() new_d.x = 42 # The main events should receive the nested change assert main_mock.call_count == 1 nested_info = cast("EmissionInfo", main_mock.call_args[0][0]) assert nested_info.args == (42, 99) assert nested_info.path == (PathStep(attr="d"), PathStep(attr="x")) # Now modify the OLD object - should NOT trigger events through the parent # because it should have been disconnected main_mock.reset_mock() original_d.x = 123 # The original object's direct listeners should still work assert original_mock.call_count == 1 # But the main events should NOT have been triggered (disconnected) assert main_mock.call_count == 0 def test_lazy_child_connection() -> None: """Test that child events are only connected when parent is first connected to.""" @evented @dataclass class Child: y: int = 2 @evented @dataclass class Parent: child: Child x: int = 1 # Create parent with child child = Child() parent = Parent(child=child) # Before any connections, neither should have relay slots assert len(parent.events.all) == 0 assert len(child.events.all) == 0 mock = Mock() parent.events.all.connect(mock) # After connection # parent should have our listener, child should have relay connection assert len(parent.events.all) == 1 assert len(child.events.all) == 1 # Test that child events propagate to parent child.y = 10 mock.assert_called_once_with( EmissionInfo( child.events.y, (10, 2), path=(PathStep(attr="child"), PathStep(attr="y")) ) ) def test_team_example() -> None: @evented @dataclass class Person: name: str = "" age: int = 0 @evented @dataclass class Team: name: str = "" leader: Person = field(default_factory=Person) team = Team() # This will trigger the listener above team_level_info = EmissionInfo( team.leader.events.name, ("Alice", ""), path=(PathStep(attr="leader"), PathStep(attr="name")), ) team_leader_level_info = EmissionInfo( team.leader.events.name, ("Alice", ""), path=(PathStep(attr="name"),) ) with ( testing.assert_emitted_once_with(team.events.all, team_level_info), testing.assert_emitted_once_with( team.leader.events.all, team_leader_level_info ), testing.assert_emitted_once_with(team.leader.events.name, "Alice", ""), testing.assert_not_emitted(team.events.leader), ): team.leader.name = "Alice" def test_signal_instance_emits_on_subevents() -> None: @evented @dataclass class Person: name: str = "" age: int = 0 @evented @dataclass class Team: name: str = "" leader: Person = field(default_factory=Person) team = Team(name="A-Team", leader=Person(name="Hannibal", age=59)) mock = Mock() team.events.leader.connect(mock, emit_on_evented_child_events=True) team.leader.age = 60 mock.assert_called_once_with(Person(name="Hannibal", age=60), None) 
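

# A minimal illustrative sketch (an addition, not part of the upstream suite):
# it exercises the same `psygnal.testing` context managers used in
# `test_team_example` above on a single evented dataclass, assuming only APIs
# already used in this module (`@evented`, per-field `events.<name>` signals
# emitting `(new, old)`, and the `assert_emitted_once_with` /
# `assert_not_emitted` helpers).
def test_testing_helpers_sketch() -> None:
    @evented
    @dataclass
    class Point:
        x: int = 0
        y: int = 0

    pt = Point()
    with (
        # x changes 0 -> 1, so its signal should emit exactly once with (new, old)
        testing.assert_emitted_once_with(pt.events.x, 1, 0),
        # y is untouched, so its signal should not emit at all
        testing.assert_not_emitted(pt.events.y),
    ):
        pt.x = 1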
psygnal-0.15.0/tests/test_evented_model.py0000644000000000000000000007265515073705675015623 0ustar00import inspect import sys from collections.abc import Sequence from contextlib import nullcontext from typing import Any, ClassVar, Protocol, runtime_checkable from unittest.mock import Mock, call, patch import numpy as np import pytest from pydantic import Field from psygnal import EventedModel from psygnal.containers import EventedList from psygnal.containers._evented_dict import EventedDict from psygnal.containers._evented_set import EventedSet try: from pydantic import PrivateAttr except ImportError: pytest.skip("pydantic not installed", allow_module_level=True) import pydantic.version from pydantic import BaseModel from psygnal import EmissionInfo from psygnal._group import SignalGroup from psygnal._signal import ReemissionMode PYDANTIC_V2 = pydantic.version.VERSION.startswith("2") try: from pydantic import field_serializer except ImportError: def field_serializer(*args, **kwargs): def decorator(cls): return cls return decorator def asdict(obj: "BaseModel") -> dict: if PYDANTIC_V2: return obj.model_dump() else: return obj.dict() def asjson(obj: BaseModel) -> str: if PYDANTIC_V2: return obj.model_dump_json() else: return obj.json() def test_creating_empty_evented_model(): """Test creating an empty evented pydantic model.""" model = EventedModel() assert model is not None assert model.events is not None def test_evented_model(): """Test creating an evented pydantic model.""" class User(EventedModel): id: int name: str = "Alex" age: ClassVar[int] = 100 user = User(id=0) # test basic functionality assert user.id == 0 assert user.name == "Alex" user.id = 2 assert user.id == 2 # test event system assert isinstance(user.events, SignalGroup) assert "id" in user.events assert "name" in user.events # ClassVars are excluded from events assert "age" not in user.events id_mock = Mock() name_mock = Mock() user.events.id.connect(id_mock) user.events.name.connect(name_mock) # setting an attribute should, by default, emit an event with the value user.id = 4 id_mock.assert_called_with(4) name_mock.assert_not_called() # and event should only be emitted when the value has changed. 
id_mock.reset_mock() user.id = 4 id_mock.assert_not_called() name_mock.assert_not_called() def test_evented_model_array_updates(): """Test updating an evented pydantic model with an array.""" class Model(EventedModel): """Demo evented model.""" values: np.ndarray if PYDANTIC_V2: model_config = {"arbitrary_types_allowed": True} else: class Config: arbitrary_types_allowed = True first_values = np.array([1, 2, 3]) model = Model(values=first_values) # Mock events values_mock = Mock() model.events.values.connect(values_mock) np.testing.assert_almost_equal(model.values, first_values) # Updating with new data new_array = np.array([1, 2, 4]) model.values = new_array np.testing.assert_array_equal(values_mock.call_args.args[0], new_array) values_mock.reset_mock() # Updating with same data, no event should be emitted model.values = new_array values_mock.assert_not_called() def test_evented_model_np_array_equality(): """Test checking equality with an evented model with direct numpy.""" class Model(EventedModel): values: np.ndarray if PYDANTIC_V2: model_config = {"arbitrary_types_allowed": True} else: class Config: arbitrary_types_allowed = True model1 = Model(values=np.array([1, 2, 3])) model2 = Model(values=np.array([1, 5, 6])) assert model1 == model1 assert model1 != model2 model2.values = np.array([1, 2, 3]) assert model1 == model2 def test_evented_model_da_array_equality(): """Test checking equality with an evented model with direct dask.""" da = pytest.importorskip("dask.array") class Model(EventedModel): values: da.Array if PYDANTIC_V2: model_config = {"arbitrary_types_allowed": True} else: class Config: arbitrary_types_allowed = True r = da.ones((64, 64)) model1 = Model(values=r) model2 = Model(values=da.ones((64, 64))) assert model1 == model1 # dask arrays will only evaluate as equal if they are the same object. assert model1 != model2 model2.values = r assert model1 == model2 def test_values_updated() -> None: class User(EventedModel): """Demo evented model. Parameters ---------- id : int User id. name : str, optional User name. """ id: int user_name: str = "A" age: ClassVar[int] = 100 user1 = User(id=0) user2 = User(id=1, user_name="K") # Check user1 and user2 dicts assert asdict(user1) == {"id": 0, "user_name": "A"} assert asdict(user2) == {"id": 1, "user_name": "K"} # Add mocks user1_events = Mock() u1_id_events = Mock() u2_id_events = Mock() user1.events.all.connect(user1_events) user1.events.all.connect(user1_events) user1.events.id.connect(u1_id_events) user2.events.id.connect(u2_id_events) user1.events.id.connect(u1_id_events) user2.events.id.connect(u2_id_events) # Update user1 from user2 user1.update(user2) assert asdict(user1) == {"id": 1, "user_name": "K"} u1_id_events.assert_called_with(1) u2_id_events.assert_not_called() # NOTE: # user.events.user_name is NOT actually emitted because it has no callbacks # connected to it. see test_comparison_count below... 
user1_events.assert_has_calls( [ call(EmissionInfo(signal=user1.events.id, args=(1,))), # call(EmissionInfo(signal=user1.events.user_name, args=("K",))), ] ) u1_id_events.reset_mock() u2_id_events.reset_mock() user1_events.reset_mock() # Update user1 from user2 again, no event emission expected user1.update(user2) assert asdict(user1) == {"id": 1, "user_name": "K"} u1_id_events.assert_not_called() u2_id_events.assert_not_called() assert user1_events.call_count == 0 def test_update_with_inner_model_union(): class Inner(EventedModel): w: str class AltInner(EventedModel): x: str class Outer(EventedModel): y: int z: Inner | AltInner original = Outer(y=1, z=Inner(w="a")) updated = Outer(y=2, z=AltInner(x="b")) original.update(updated, recurse=False) assert original == updated def test_update_with_inner_model_protocol(): @runtime_checkable class InnerProtocol(Protocol): def string(self) -> str: ... # Protocol fields are not successfully set without explicit validation. @classmethod def __get_validators__(cls): yield cls.validate @classmethod def __get_pydantic_core_schema__(cls, _source_type: Any, _handler: Any): from pydantic_core import core_schema return core_schema.no_info_plain_validator_function(cls.validate) @classmethod def validate(cls, v): return v class Inner(EventedModel): w: str def string(self) -> str: return self.w class AltInner(EventedModel): x: str def string(self) -> str: return self.x class Outer(EventedModel): y: int z: InnerProtocol original = Outer(y=1, z=Inner(w="a")) updated = Outer(y=2, z=AltInner(x="b")) original.update(updated, recurse=False) assert original == updated def test_evented_model_signature(): class T(EventedModel): x: int y: str = "yyy" z: bytes = b"zzz" assert isinstance(T.__signature__, inspect.Signature) sig = inspect.signature(T) assert str(sig) == "(*, x: int, y: str = 'yyy', z: bytes = b'zzz') -> None" class MyObj: def __init__(self, a: int, b: str) -> None: self.a = a self.b = b @classmethod def __get_validators__(cls): yield cls.validate_type @classmethod def __get_pydantic_core_schema__(cls, _source_type: Any, _handler: Any): from pydantic_core import core_schema return core_schema.no_info_plain_validator_function(cls.validate_type) @classmethod def validate_type(cls, val): # turn a generic dict into object if isinstance(val, dict): a = val.get("a") b = val.get("b") elif isinstance(val, MyObj): return val # perform additional validation here return cls(a, b) def __eq__(self, other): return self.__dict__ == other.__dict__ def _json_encode(self): return self.__dict__ def test_evented_model_serialization(): class Model(EventedModel): """Demo evented model.""" obj: MyObj @field_serializer("obj") def serialize_dt(self, dt: MyObj) -> dict: return dt.__dict__ m = Model(obj=MyObj(1, "hi")) raw = asjson(m) if PYDANTIC_V2: assert raw == '{"obj":{"a":1,"b":"hi"}}' deserialized = Model.model_validate_json(raw) else: assert raw == '{"obj": {"a": 1, "b": "hi"}}' deserialized = Model.parse_raw(raw) assert deserialized == m def test_nested_evented_model_serialization(): """Test that encoders on nested sub-models can be used by top model.""" class NestedModel(EventedModel): obj: MyObj @field_serializer("obj") def serialize_dt(self, dt: MyObj) -> dict: return dt.__dict__ class Model(EventedModel): nest: NestedModel m = Model(nest={"obj": {"a": 1, "b": "hi"}}) raw = asjson(m) if PYDANTIC_V2: assert raw == r'{"nest":{"obj":{"a":1,"b":"hi"}}}' deserialized = Model.model_validate_json(raw) else: assert raw == r'{"nest": {"obj": {"a": 1, "b": "hi"}}}' deserialized = 
Model.parse_raw(raw) assert deserialized == m def test_evented_model_dask_delayed(): """Test that evented models work with dask delayed objects""" dd = pytest.importorskip("dask.delayed") dask = pytest.importorskip("dask") class MyObject(EventedModel): attribute: dd.Delayed if PYDANTIC_V2: model_config = {"arbitrary_types_allowed": True} else: class Config: arbitrary_types_allowed = True @dask.delayed def my_function(): pass o1 = MyObject(attribute=my_function) # check that equality checking works as expected assert o1 == o1 class T(EventedModel): a: int = 1 b: int = 1 @property def c(self) -> list[int]: return [self.a, self.b] @c.setter def c(self, val: Sequence[int]): self.a, self.b = val if PYDANTIC_V2: model_config = { "allow_property_setters": True, "guess_property_dependencies": True, } else: class Config: allow_property_setters = True guess_property_dependencies = True def test_defaults(): class R(EventedModel): x: str = "hi" default_r = R() class D(EventedModel): a: int = 1 b: int = 1 r: R = default_r d = D() assert d._defaults == {"a": 1, "b": 1, "r": default_r} d.update({"a": 2, "r": {"x": "asdf"}}, recurse=True) assert asdict(d) == {"a": 2, "b": 1, "r": {"x": "asdf"}} assert asdict(d) != d._defaults d.reset() assert asdict(d) == d._defaults @pytest.mark.skipif(PYDANTIC_V2, reason="enum values seem broken on pydantic") def test_enums_as_values(): from enum import Enum class MyEnum(Enum): A = "value" class SomeModel(EventedModel): a: MyEnum = MyEnum.A m = SomeModel() assert asdict(m) == {"a": MyEnum.A} with m.enums_as_values(): assert asdict(m) == {"a": "value"} assert asdict(m) == {"a": MyEnum.A} def test_properties_with_explicit_property_dependencies(): class MyModel(EventedModel): a: int = 1 b: int = 1 @property def c(self) -> list[int]: return [self.a, self.b] @c.setter def c(self, val: Sequence[int]) -> None: self.a, self.b = val if PYDANTIC_V2: model_config = { "allow_property_setters": True, "field_dependencies": {"c": ["a", "b"]}, } else: class Config: allow_property_setters = True field_dependencies = {"c": ["a", "b"]} assert list(MyModel.__property_setters__) == ["c"] # the metaclass should have figured out that both a and b affect c assert MyModel.__field_dependents__ == {"a": {"c"}, "b": {"c"}} def test_evented_model_with_property_setters(): t = T() assert list(T.__property_setters__) == ["c"] # the metaclass should have figured out that both a and b affect c assert T.__field_dependents__ == {"a": {"c"}, "b": {"c"}} # all the fields and properties behave as expected assert t.c == [1, 1] t.a = 4 assert t.c == [4, 1] t.c = [2, 3] assert t.c == [2, 3] assert t.a == 2 assert t.b == 3 def test_evented_model_with_property_setters_events(): t = T() assert "c" in t.events # the setter has an event mock_a = Mock() mock_b = Mock() mock_c = Mock() t.events.a.connect(mock_a) t.events.b.connect(mock_b) t.events.c.connect(mock_c) # setting t.c emits events for all three a, b, and c t.c = [10, 20] mock_a.assert_called_with(10) mock_b.assert_called_with(20) mock_c.assert_called_with([10, 20]) assert t.a == 10 assert t.b == 20 mock_a.reset_mock() mock_b.reset_mock() mock_c.reset_mock() # setting t.a emits events for a and c, but not b # this is because we declared c to be dependent on ['a', 'b'] t.a = 5 mock_a.assert_called_with(5) mock_c.assert_called_with([5, 20]) mock_b.assert_not_called() assert t.c == [5, 20] mock_a.reset_mock() t.a = 5 # no change, no events mock_a.assert_not_called() def test_non_setter_with_dependencies() -> None: with pytest.raises( ValueError, match=r"Fields 
with dependencies must be fields or property.setters" ): class M(EventedModel): x: int @property def y(self): ... @y.setter def y(self, v): ... if PYDANTIC_V2: model_config = { "allow_property_setters": True, "field_dependencies": {"a": []}, } else: class Config: allow_property_setters = True field_dependencies = {"a": []} def test_unrecognized_property_dependencies(): with pytest.warns(UserWarning, match="cannot depend on unrecognized attribute"): class M(EventedModel): x: int @property def y(self): ... @y.setter def y(self, v): ... if PYDANTIC_V2: model_config = { "allow_property_setters": True, "field_dependencies": {"y": ["b"]}, } else: class Config: allow_property_setters = True field_dependencies = {"y": ["b"]} @pytest.mark.skipif(PYDANTIC_V2, reason="pydantic 2 does not support this") def test_setattr_before_init(): class M(EventedModel): _x: int = PrivateAttr() def __init__(_model_self_, x: int, **data) -> None: _model_self_._x = x super().__init__(**data) @property def x(self) -> int: return self._x m = M(x=2) assert m.x == 2 def test_setter_inheritance(): class M(EventedModel): _x: int = PrivateAttr() def __init__(self, x: int, **data: Any) -> None: super().__init__(**data) self.x = x @property def x(self) -> int: return self._x @x.setter def x(self, v: int) -> None: self._x = v if PYDANTIC_V2: model_config = {"allow_property_setters": True} else: class Config: allow_property_setters = True assert M(x=2).x == 2 class N(M): ... assert N(x=2).x == 2 with pytest.raises(ValueError, match="Cannot set 'allow_property_setters' to"): class Bad(M): if PYDANTIC_V2: model_config = {"allow_property_setters": False} else: class Config: allow_property_setters = False def test_derived_events() -> None: class Model(EventedModel): a: int @property def b(self) -> int: return self.a + 1 @b.setter def b(self, b: int) -> None: self.a = b - 1 if PYDANTIC_V2: model_config = { "allow_property_setters": True, "field_dependencies": {"b": ["a"]}, } else: class Config: allow_property_setters = True field_dependencies = {"b": ["a"]} mock_a = Mock() mock_b = Mock() m = Model(a=0) m.events.a.connect(mock_a) m.events.b.connect(mock_b) m.b = 3 mock_a.assert_called_once_with(2) mock_b.assert_called_once_with(3) def test_root_validator_events(): class Model(EventedModel): x: int y: int if PYDANTIC_V2: from pydantic import model_validator model_config = { "validate_assignment": True, "field_dependencies": {"y": ["x"]}, } @model_validator(mode="before") def check(cls, values: dict) -> dict: x = values["x"] values["y"] = min(values["y"], x) return values else: from pydantic import root_validator class Config: validate_assignment = True field_dependencies = {"y": ["x"]} @root_validator def check(cls, values: dict) -> dict: x = values["x"] values["y"] = min(values["y"], x) return values m = Model(x=2, y=1) xmock = Mock() ymock = Mock() m.events.x.connect(xmock) m.events.y.connect(ymock) m.x = 0 assert m.y == 0 xmock.assert_called_once_with(0) ymock.assert_called_once_with(0) xmock.reset_mock() ymock.reset_mock() m.x = 2 assert m.y == 0 xmock.assert_called_once_with(2) ymock.assert_not_called() def test_deprecation() -> None: with pytest.warns(DeprecationWarning, match="Use 'field_dependencies' instead"): class MyModel(EventedModel): a: int = 1 b: int = 1 if PYDANTIC_V2: model_config = {"property_dependencies": {"a": ["b"]}} else: class Config: property_dependencies = {"a": ["b"]} assert MyModel.__field_dependents__ == {"b": {"a"}} def test_comparison_count() -> None: """Test that we only compare fields that are 
actually connected to events.""" class Model(EventedModel): a: int @property def b(self) -> int: return self.a + 1 @b.setter def b(self, b: int) -> None: self.a = b - 1 if PYDANTIC_V2: model_config = { "allow_property_setters": True, "field_dependencies": {"b": ["a"]}, } else: class Config: allow_property_setters = True field_dependencies = {"b": ["a"]} # pick whether to mock v1 or v2 modules model_module = sys.modules[type(Model).__module__] m = Model(a=0) b_mock = Mock() with patch.object( model_module, "_check_field_equality", wraps=model_module._check_field_equality, ) as check_mock: m.a = 1 check_mock.assert_not_called() b_mock.assert_not_called() m.events.b.connect(b_mock) with patch.object( model_module, "_check_field_equality", wraps=model_module._check_field_equality, ) as check_mock: m.a = 3 check_mock.assert_has_calls([call(Model, "a", 3, 1), call(Model, "b", 4, 2)]) b_mock.assert_called_once_with(4) def test_connect_only_to_events() -> None: """Make sure that we still make comparison and emit events when connecting only to the events group itself.""" class Model(EventedModel): a: int # pick whether to mock v1 or v2 modules model_module = sys.modules[type(Model).__module__] m = Model(a=0) mock1 = Mock() with patch.object( model_module, "_check_field_equality", wraps=model_module._check_field_equality, ) as check_mock: m.a = 1 check_mock.assert_not_called() mock1.assert_not_called() m.events.all.connect(mock1) with patch.object( model_module, "_check_field_equality", wraps=model_module._check_field_equality, ) as check_mock: m.a = 3 check_mock.assert_has_calls([call(Model, "a", 3, 1)]) mock1.assert_called_once() def test_if_event_is_emitted_only_once() -> None: """Check if, for complex property setters, the event is emitted only once.""" class SampleClass(EventedModel): a: int = 1 b: int = 2 if PYDANTIC_V2: model_config = { "allow_property_setters": True, "guess_property_dependencies": True, } else: class Config: allow_property_setters = True guess_property_dependencies = True @property def c(self): return self.a + self.b @c.setter def c(self, value): self.a = value - self.b @property def d(self): return self.a + self.b @d.setter def d(self, value): self.a = value // 2 self.b = value - self.a s = SampleClass() a_m = Mock() c_m = Mock() d_m = Mock() s.events.a.connect(a_m) s.events.c.connect(c_m) s.events.d.connect(d_m) s.d = 5 a_m.assert_called_once() c_m.assert_called_once() d_m.assert_called_once() @pytest.mark.parametrize( "mode", [ ReemissionMode.IMMEDIATE, ReemissionMode.QUEUED, {"a": ReemissionMode.IMMEDIATE, "b": ReemissionMode.QUEUED}, {"a": ReemissionMode.IMMEDIATE, "b": "err"}, {"a": ReemissionMode.QUEUED}, {}, "err", ], ) def test_evented_model_reemission(mode: str | dict) -> None: err = mode == "err" or (isinstance(mode, dict) and "err" in mode.values()) with ( pytest.raises(ValueError, match="Invalid reemission") if err else nullcontext() ): class Model(EventedModel): a: int b: int if PYDANTIC_V2: model_config = {"reemission": mode} else: class Config: reemission = mode if err: return m = Model(a=1, b=2) if isinstance(mode, dict): assert m.events.a._reemission == mode.get("a", ReemissionMode.LATEST) assert m.events.b._reemission == mode.get("b", ReemissionMode.LATEST) else: assert m.events.a._reemission == mode assert m.events.b._reemission == mode @pytest.mark.skipif(not PYDANTIC_V2, reason="computed_field added in v2") def test_computed_field() -> None: from pydantic import computed_field class MyModel(EventedModel): a: int = 1 b: int = 1 @computed_field @property 
def c(self) -> list[int]: return [self.a, self.b] @c.setter def c(self, val: Sequence[int]) -> None: self.a, self.b = val model_config = { "allow_property_setters": True, "field_dependencies": {"c": ["a", "b"]}, # type: ignore [typeddict-unknown-key] } mock_a = Mock() mock_b = Mock() mock_c = Mock() m = MyModel() m.events.a.connect(mock_a) m.events.b.connect(mock_b) m.events.c.connect(mock_c) m.c = [10, 20] mock_a.assert_called_with(10) mock_b.assert_called_with(20) mock_c.assert_called_with([10, 20]) mock_a.reset_mock() mock_c.reset_mock() m.a = 5 mock_a.assert_called_with(5) mock_c.assert_called_with([5, 20]) @pytest.mark.skipif(not PYDANTIC_V2, reason="computed_field added in v2") def test_private_field_dependents(): from pydantic import PrivateAttr, computed_field from psygnal import EventedModel class MyModel(EventedModel): _items_dict: dict[str, int] = PrivateAttr(default_factory=dict) @computed_field # type: ignore [prop-decorator] @property def item_names(self) -> list[str]: return list(self._items_dict) @computed_field # type: ignore [prop-decorator] @property def item_sum(self) -> int: return sum(self._items_dict.values()) def add_item(self, name: str, value: int) -> None: if name in self._items_dict: raise ValueError(f"Name {name} already exists!") self._items_dict = {**self._items_dict, name: value} # Ideally the following would work model_config = { "field_dependencies": { # type: ignore [typeddict-unknown-key] "item_names": ["_items_dict"], "item_sum": ["_items_dict"], } } m = MyModel() item_sum_mock = Mock() item_names_mock = Mock() m.events.item_sum.connect(item_sum_mock) m.events.item_names.connect(item_names_mock) m.add_item("a", 1) item_sum_mock.assert_called_once_with(1) item_names_mock.assert_called_once_with(["a"]) item_sum_mock.reset_mock() item_names_mock.reset_mock() m.add_item("b", 2) item_sum_mock.assert_called_once_with(3) item_names_mock.assert_called_once_with(["a", "b"]) # check direct access ass well item_sum_mock.reset_mock() m._items_dict = m._items_dict.copy() assert item_sum_mock.call_count == 0 m._items_dict = {} item_sum_mock.assert_called_once_with(0) def test_primary_vs_dependent_optimization() -> None: """Test that dependent properties are not checked when no primary changes occur. This test verifies the optimization where, if no primary fields actually change values, the system skips checking dependent properties entirely. The current implementation processes primary changes first, and if none of them result in actual value changes, it skips checking dependent properties entirely. 
""" class Model(EventedModel): a: int = 1 @property def b(self) -> int: return self.a + 1 @b.setter def b(self, b: int) -> None: self.a = b - 1 if PYDANTIC_V2: model_config = { "allow_property_setters": True, "field_dependencies": {"b": ["a"]}, } else: class Config: allow_property_setters = True field_dependencies = {"b": ["a"]} # pick whether to mock v1 or v2 modules model_module = sys.modules[type(Model).__module__] m = Model(a=5) # Connect listeners to both primary field and dependent property # so that the system has reason to track changes mock_a = Mock() mock_b = Mock() m.events.a.connect(mock_a) m.events.b.connect(mock_b) # Set field 'a' to the same value it already has (no actual change) with patch.object( model_module, "_check_field_equality", wraps=model_module._check_field_equality, ) as check_mock: m.a = 5 # Same value, no change # implementation should only check the primary field 'a' # and skip checking dependent property 'b' since no primary change occurred check_mock.assert_has_calls([call(Model, "a", 5, 5)]) assert check_mock.call_count == 1, ( f"Expected 1 equality check, got {check_mock.call_count}" ) # No events should be emitted since no actual changes occurred mock_a.assert_not_called() mock_b.assert_not_called() @pytest.mark.skipif(not PYDANTIC_V2, reason="v2 serialization features") def test_serialization_and_schema(): class TestModel(EventedModel): name: str elist_of_str: EventedList[str] = Field(default_factory=EventedList) eset_of_str: EventedSet[str] = Field(default_factory=EventedSet) edict_of_str: EventedDict[str, str] = Field(default_factory=EventedDict) model = TestModel(name="Test") assert isinstance(model.elist_of_str, EventedList) assert isinstance(model.eset_of_str, EventedSet) assert isinstance(model.edict_of_str, EventedDict) dumped = model.model_dump(mode="python") assert isinstance(dumped["elist_of_str"], EventedList) assert isinstance(dumped["eset_of_str"], EventedSet) assert isinstance(dumped["edict_of_str"], EventedDict) dumped = model.model_dump(mode="json") assert isinstance(dumped["elist_of_str"], list) assert isinstance(dumped["eset_of_str"], list) assert isinstance(dumped["edict_of_str"], dict) json_dump = model.model_dump_json() assert isinstance(json_dump, str) assert TestModel.model_json_schema(mode="serialization") assert TestModel.model_json_schema(mode="validation") psygnal-0.15.0/tests/test_group.py0000644000000000000000000003114115073705675014126 0ustar00from __future__ import annotations from copy import deepcopy from typing import TYPE_CHECKING, Annotated from unittest.mock import Mock, call import pytest import psygnal from psygnal import EmissionInfo, Signal, SignalGroup, SignalInstance from psygnal._group import SignalRelay if TYPE_CHECKING: from collections.abc import Callable class MyGroup(SignalGroup): sig1 = Signal(int) sig2 = Signal(str) with pytest.warns(): class ConflictGroup(SignalGroup): sig1 = Signal(int) connect = Signal(int) # type: ignore def test_cannot_instantiate_group() -> None: with pytest.raises(TypeError, match="Cannot instantiate `SignalGroup` directly"): SignalGroup() def test_signal_group() -> None: assert not MyGroup.psygnals_uniform() with pytest.warns( FutureWarning, match="The `is_uniform` method on SignalGroup is deprecated" ): assert not MyGroup.is_uniform() group = MyGroup() assert not group.psygnals_uniform() assert list(group) == ["sig1", "sig2"] # testing __iter__ assert group.sig1 is group["sig1"] assert set(group.signals) == {"sig1", "sig2"} assert repr(group) == "" def test_uniform_group() -> 
None: """In a uniform group, all signals must have the same signature.""" class MyStrictGroup(SignalGroup, strict=True): sig1 = Signal(int) sig2 = Signal(int) assert MyStrictGroup.psygnals_uniform() group = MyStrictGroup() assert group.psygnals_uniform() assert set(group) == {"sig1", "sig2"} with pytest.raises(TypeError) as e: class BadGroup(SignalGroup, strict=True): sig1 = Signal(str) sig2 = Signal(int) assert str(e.value).startswith("All Signals in a strict SignalGroup must") @pytest.mark.skipif(Annotated is None, reason="requires typing.Annotated") def test_nonhashable_args() -> None: """Test that non-hashable annotations are allowed in a SignalGroup""" class MyGroup(SignalGroup): sig1 = Signal(Annotated[int, {"a": 1}]) # type: ignore sig2 = Signal(Annotated[float, {"b": 1}]) # type: ignore assert not MyGroup.psygnals_uniform() with pytest.raises(TypeError): class MyGroup2(SignalGroup, strict=True): sig1 = Signal(Annotated[int, {"a": 1}]) # type: ignore sig2 = Signal(Annotated[float, {"b": 1}]) # type: ignore @pytest.mark.parametrize("direct", [True, False]) def test_signal_group_connect(direct: bool) -> None: mock = Mock() group = MyGroup() if direct: # the callback wants the emitted arguments directly group.connect_direct(mock) else: # the callback will receive an EmissionInfo tuple # (SignalInstance, arg_tuple) group.connect(mock) group.sig1.emit(1) group.sig2.emit("hi") assert mock.call_count == 2 # if connect_with_info was used, the callback will be given an EmissionInfo # tuple that contains the args as well as the signal instance used if direct: expected_calls = [call(1), call("hi")] else: expected_calls = [ call(EmissionInfo(group.sig1, (1,))), call(EmissionInfo(group.sig2, ("hi",))), ] mock.assert_has_calls(expected_calls) def test_signal_group_connect_no_args() -> None: """Test that group.all.connect can take a callback that wants no args""" group = MyGroup() count = [] def my_slot() -> None: count.append(1) group.connect(my_slot) group.sig1.emit(1) group.sig2.emit("hi") assert len(count) == 2 def test_group_blocked() -> None: group = MyGroup() mock1 = Mock() mock2 = Mock() group.connect(mock1) group.sig1.connect(mock2) group.sig1.emit(1) mock1.assert_called_once_with(EmissionInfo(group.sig1, (1,))) mock2.assert_called_once_with(1) mock1.reset_mock() mock2.reset_mock() group.sig2.block() assert group.sig2._is_blocked with group.all.blocked(): group.sig1.emit(1) assert group.sig1._is_blocked assert not group.sig1._is_blocked # the blocker should have restored subthings to their previous states assert group.sig2._is_blocked mock1.assert_not_called() mock2.assert_not_called() def test_group_blocked_exclude() -> None: """Test that we can exempt certain signals from being blocked.""" group = MyGroup() mock1 = Mock() mock2 = Mock() group.sig1.connect(mock1) group.sig2.connect(mock2) with group.all.blocked(exclude=("sig2",)): group.sig1.emit(1) group.sig2.emit("hi") mock1.assert_not_called() mock2.assert_called_once_with("hi") def test_group_disconnect_single_slot() -> None: """Test that we can disconnect single slots from groups.""" group = MyGroup() mock1 = Mock() mock2 = Mock() group.sig1.connect(mock1) group.sig2.connect(mock2) group.disconnect(mock1) group.sig1.emit() mock1.assert_not_called() group.sig2.emit() mock2.assert_called_once() def test_group_disconnect_all_slots() -> None: """Test that we can disconnect all slots from groups.""" group = MyGroup() mock1 = Mock() mock2 = Mock() group.sig1.connect(mock1) group.sig2.connect(mock2) group.disconnect() group.sig1.emit() 
group.sig2.emit() mock1.assert_not_called() mock2.assert_not_called() def test_weakref() -> None: """Make sure that the group doesn't keep a strong reference to the instance.""" import gc class T: ... obj = T() group = MyGroup(obj) assert group.all.instance is obj del obj gc.collect() assert group.all.instance is None @pytest.mark.parametrize( "Group, signame, get_sig", [ (MyGroup, "sig1", getattr), (MyGroup, "sig1", SignalGroup.__getitem__), (ConflictGroup, "sig1", getattr), (ConflictGroup, "sig1", SignalGroup.__getitem__), (ConflictGroup, "connect", SignalGroup.__getitem__), ], ) def test_group_deepcopy( Group: type[SignalGroup], signame: str, get_sig: Callable ) -> None: """_summary_ Parameters ---------- Group : type[SignalGroup] The group class to test, where ConflictGroup has a signal named "connect" which conflicts with the SignalGroup method of the same name. signame : str The name of the signal to test get_sig : Callable The method to use to get the signal instance from the group. we don't test getattr for ConflictGroup because it has a signal named "connect" """ class T: def method(self) -> None: ... obj = T() group = Group(obj) assert deepcopy(group) is not group # but no warning group.connect(obj.method) group2 = deepcopy(group) assert not len(group2.all) mock = Mock() mock2 = Mock() group.connect(mock) group2.connect(mock2) # test that we can access signalinstances (either using getattr or __getitem__) siginst1 = get_sig(group, signame) siginst2 = get_sig(group2, signame) assert isinstance(siginst1, SignalInstance) assert isinstance(siginst2, SignalInstance) assert siginst1 is not siginst2 # test that emitting from the deepcopied group doesn't affect the original siginst2.emit(1) mock.assert_not_called() mock2.assert_called_with(EmissionInfo(siginst2, (1,))) # test that emitting from the original group doesn't affect the deepcopied one mock2.reset_mock() siginst1.emit(1) mock.assert_called_with(EmissionInfo(siginst1, (1,))) mock2.assert_not_called() def test_group_conflicts() -> None: with pytest.warns(UserWarning, match=r"Name \['connect'\] is reserved"): class MyGroup(SignalGroup): connect = Signal(int) # type: ignore other_signal = Signal(int) class SubGroup(MyGroup): sig4 = Signal(int) assert "connect" in MyGroup._psygnal_signals assert "other_signal" in MyGroup._psygnal_signals group = MyGroup() assert isinstance(group["connect"], SignalInstance) assert not isinstance(group.connect, SignalInstance) with pytest.raises( TypeError, match="SignalGroup subclass cannot have attributes starting with '_psygnal'", ): class MyGroup2(SignalGroup): _psygnal_private = 1 assert group.other_signal.name == "other_signal" assert group["connect"].name == "connect" subgroup = SubGroup() assert subgroup["connect"].name == "connect" assert subgroup.other_signal.name == "other_signal" def test_group_iter() -> None: class Group1(SignalGroup): sig1 = Signal() sig2 = Signal() sig3 = Signal() # Delete Signal on Group class # You should never do that del Group1._psygnal_signals["sig1"] assert set(Group1._psygnal_signals) == {"sig2", "sig3"} assert hasattr(Group1, "sig1") g = Group1() # Delete Signal on Group instance # You should never do that del g._psygnal_signals["sig2"] assert "sig1" not in g assert "sig2" in g assert set(g) == {"sig2", "sig3"} with pytest.raises(KeyError): g["sig1"] sig1_t = g.sig1 assert isinstance(sig1_t, SignalInstance) assert sig1_t.name == "sig1" sig2 = g["sig2"] assert isinstance(sig2, SignalInstance) assert sig2.name == "sig2" sig2_t = g.sig2 assert isinstance(sig2_t, 
SignalInstance) assert sig2_t.name == "sig2" # Delete SignalInstance del g._psygnal_instances["sig3"] assert "sig3" not in g assert set(g) == {"sig2"} with pytest.raises(KeyError): g["sig3"] def test_group_subclass() -> None: # Signals are passed to sub-classes class Group1(SignalGroup): sig1 = Signal() class Group2(Group1): sig2 = Signal() assert "sig1" in Group1._psygnal_signals assert "sig1" in Group2._psygnal_signals assert "sig2" in Group2._psygnal_signals assert "sig2" not in Group1._psygnal_signals assert hasattr(Group1, "sig1") and isinstance(Group1.sig1, Signal) assert hasattr(Group2, "sig1") and isinstance(Group2.sig1, Signal) assert hasattr(Group2, "sig2") and isinstance(Group2.sig2, Signal) assert not hasattr(Group1, "sig2") def test_delayed_relay_connect() -> None: group = MyGroup() mock = Mock() gmock = Mock() assert len(group.sig1) == 0 group.sig1.connect(mock) # group relay hasn't been connected to sig1 or sig2 yet assert len(group.sig1) == 1 assert len(group.sig2) == 0 group.all.connect(gmock) # NOW the relay is connected assert len(group.sig1) == 2 assert len(group.sig2) == 1 method = group.sig1._slots[-1].dereference() assert method assert method.__name__ == "_slot_relay" group.sig1.emit(1) mock.assert_called_once_with(1) gmock.assert_called_once_with(EmissionInfo(group.sig1, (1,))) group.all.disconnect(gmock) assert len(group.sig1) == 1 assert len(group.all) == 0 mock.reset_mock() gmock.reset_mock() group.sig1.emit(1) mock.assert_called_once_with(1) gmock.assert_not_called() @pytest.mark.skipif(psygnal._compiled, reason="requires uncompiled psygnal") def test_group_relay_signatures() -> None: from inspect import signature for name in dir(SignalGroup): if ( hasattr(SignalRelay, name) and not name.startswith("_") and callable(getattr(SignalRelay, name)) ): group_sig = signature(getattr(SignalGroup, name)) relay_sig = signature(getattr(SignalRelay, name)) assert group_sig == relay_sig def test_group_relay_passthrough() -> None: group = MyGroup() mock1 = Mock() mock2 = Mock() # test connection group.connect(mock1) group.all.connect(mock2) group.sig1.emit(1) mock1.assert_called_once_with(EmissionInfo(group.sig1, (1,))) mock2.assert_called_once_with(EmissionInfo(group.sig1, (1,))) mock1.reset_mock() mock2.reset_mock() # test disconnection group.disconnect(mock1) group.all.disconnect(mock2) group.sig1.emit("hi") mock1.assert_not_called() mock2.assert_not_called() @group.connect(check_nargs=True) # testing the decorator as well def _(x: int) -> None: mock1(x) group.all.connect(mock2) # test blocking with group.blocked(): group.sig1.emit(1) mock1.assert_not_called() mock2.assert_not_called() with group.all.blocked(): group.sig1.emit(1) mock1.assert_not_called() mock2.assert_not_called() # smoke test the rest group.connect_direct(mock1) group.block() group.unblock() group.blocked() group.pause() group.resume() group.paused() psygnal-0.15.0/tests/test_group_aliases.py0000644000000000000000000002506515073705675015637 0ustar00# from __future__ import annotations # breaks msgspec Annotated from typing import ClassVar from unittest.mock import Mock import pytest from psygnal import ( Signal, SignalGroup, SignalGroupDescriptor, ) @pytest.mark.parametrize( "type_", [ "dataclass", "attrs", "pydantic", "msgspec", ], ) def test_alias_parameters(type_: str) -> None: root_aliases = {"b": None, "bb": None} class MyGroup(SignalGroup, signal_aliases=root_aliases): b = Signal(str, str) bb = Signal(str, str) foo_options = {"signal_aliases": {"_b": None}} bar_options = { "signal_aliases": lambda 
x: None if x.startswith("_") else f"{x}_changed" } baz_options = {"signal_aliases": {"a": "a_changed", "_b": "b_changed"}} baz2_options = { "signal_group_class": MyGroup, "signal_aliases": {"aa": "a", "bb": "b"}, } if type_ == "dataclass": from dataclasses import dataclass, field @dataclass class Foo: events: ClassVar = SignalGroupDescriptor(**foo_options) a: int _b: str @dataclass class Bar: events: ClassVar = SignalGroupDescriptor(**bar_options) a: int _b: str @dataclass class Baz: events: ClassVar = SignalGroupDescriptor(**baz_options) a: int _b: str = field(default="b") @property def b(self) -> str: return self._b @b.setter def b(self, value: str): self._b = value @dataclass class Baz2: events: ClassVar = SignalGroupDescriptor(**baz2_options) a: int aa: int b: str bb: str elif type_ == "attrs": from attrs import define, field @define class Foo: events: ClassVar = SignalGroupDescriptor(**foo_options) a: int _b: str = field(alias="_b") @define class Bar: events: ClassVar = SignalGroupDescriptor(**bar_options) a: int _b: str = field(alias="_b") @define class Baz: events: ClassVar = SignalGroupDescriptor(**baz_options) a: int _b: str = field(alias="_b", default="b") @property def b(self) -> str: return self._b @b.setter def b(self, value: str): self._b = value @define class Baz2: events: ClassVar = SignalGroupDescriptor(**baz2_options) a: int aa: int b: str bb: str elif type_ == "pydantic": pytest.importorskip("pydantic", minversion="2") from pydantic import BaseModel class Foo(BaseModel): events: ClassVar = SignalGroupDescriptor(**foo_options) a: int _b: str # not a field anyway class Bar(BaseModel): events: ClassVar = SignalGroupDescriptor(**bar_options) a: int _b: str # not a field anyway class Baz(BaseModel): events: ClassVar = SignalGroupDescriptor(**baz_options) a: int _b: str = "b" # not defining a field, signal will not be created @property def b(self) -> str: return self._b @b.setter def b(self, value: str): self._b = value class Baz2(BaseModel): events: ClassVar = SignalGroupDescriptor(**baz2_options) a: int aa: int b: str bb: str elif type_ == "msgspec": msgspec = pytest.importorskip("msgspec") class Foo(msgspec.Struct): # type: ignore events: ClassVar = SignalGroupDescriptor(**foo_options) a: int _b: str class Bar(msgspec.Struct): # type: ignore events: ClassVar = SignalGroupDescriptor(**bar_options) a: int _b: str class Baz(msgspec.Struct): # type: ignore events: ClassVar = SignalGroupDescriptor(**baz_options) a: int _b: str = "b" @property def b(self) -> str: return self._b @b.setter def b(self, value: str): self._b = value class Baz2(msgspec.Struct): # type: ignore events: ClassVar = SignalGroupDescriptor(**baz2_options) a: int aa: int b: str bb: str # Instantiate objects foo = Foo(a=1, _b="b") bar = Bar(a=1, _b="b") baz = Baz(a=1) baz2 = Baz2(a=1, aa=2, b="b", bb="bb") # Check signals assert set(foo.events) == {"a"} assert hasattr(foo.events, "_psygnal_aliases") assert foo.events._psygnal_aliases == foo_options["signal_aliases"] assert set(bar.events) == {"a_changed"} assert hasattr(bar.events, "_psygnal_aliases") if type_.startswith("pydantic"): assert bar.events._psygnal_aliases == {"a": "a_changed"} else: assert bar.events._psygnal_aliases == {"a": "a_changed", "_b": None} if type_.startswith("pydantic"): assert set(baz.events) == {"a_changed"} else: assert set(baz.events) == {"a_changed", "b_changed"} assert hasattr(baz.events, "_psygnal_aliases") assert baz.events._psygnal_aliases == baz_options["signal_aliases"] # with pytest.warns(UserWarning, match=r"Skip signal 
\'a\', was already created"): with pytest.warns(UserWarning) as record: assert set(baz2.events) == {"a", "b", "bb"} assert len(record) == 2 assert record[0].message.args[0].startswith("Skip signal 'a', was already created") assert record[1].message.args[0].startswith("Skip signal 'b', was already defined") assert hasattr(baz.events, "_psygnal_aliases") assert baz2.events._psygnal_aliases == { **root_aliases, **baz2_options["signal_aliases"], } mock = Mock() foo.events.a.connect(mock) bar.events.a_changed.connect(mock) baz.events.a_changed.connect(mock) if not type_.startswith("pydantic"): baz.events.b_changed.connect(mock) baz2.events.a.connect(mock) baz2.events.b.connect(mock) baz2.events.bb.connect(mock) # Foo foo.a = 1 mock.assert_not_called() foo.a = 2 mock.assert_called_once_with(2, 1) mock.reset_mock() foo._b = "b" foo._b = "c" mock.assert_not_called() # Bar bar.a = 1 mock.assert_not_called() bar.a = 2 mock.assert_called_once_with(2, 1) mock.reset_mock() bar._b = "b" bar._b = "c" mock.assert_not_called() # Baz baz.a = 1 mock.assert_not_called() baz.a = 2 mock.assert_called_once_with(2, 1) mock.reset_mock() # Baz2 baz2.a = 1 baz2.aa = 2 mock.assert_not_called() baz2.a = 2 mock.assert_called_once_with(2, 1) mock.reset_mock() baz2.aa = 3 mock.assert_called_once_with(3, 2) mock.reset_mock() baz2.b = "b" mock.assert_not_called() baz2.b = "c" mock.assert_not_called() baz2.bb = "bb" mock.assert_not_called() baz2.bb = "bbb" mock.assert_called_once_with("bbb", "bb") mock.reset_mock() # pydantic v1 does not support properties if type_ == "pydantic_v1": return baz.b = "b" mock.assert_not_called() baz.b = "c" if not type_.startswith("pydantic"): mock.assert_called_once_with("c", "b") def test_direct_signal_group() -> None: class FooSignalGroup(SignalGroup, signal_aliases={"e": None}): a = Signal(int, int) b_changed = Signal(float, float) c = Signal(str, str) d = Signal(str, str) e = Signal(str, str) class Foo: events: ClassVar = SignalGroupDescriptor( signal_group_class=FooSignalGroup, collect_fields=False, signal_aliases={ "b": "b_changed", "c": None, "_c": "c", "_e": "e", }, ) a: int b: float _c: str _d: str _e: int def __init__( self, a: int = 1, b: float = 2.0, c: str = "c", d: str = "d", _e: int = 5, ): self.a = a self.b = b self.c = c self.d = d self._e = _e @property def c(self) -> str: return self._c @c.setter def c(self, value: str): self._c = value @property def d(self) -> str: return self._d.lower() @d.setter def d(self, value: str): self._d = value foo = Foo() assert hasattr(foo.events, "_psygnal_aliases") assert foo.events._psygnal_aliases == { "b": "b_changed", "_c": "c", "c": None, "e": None, "_e": "e", } mock = Mock() foo.events.a.connect(mock) foo.events.b_changed.connect(mock) foo.events.c.connect(mock) foo.events.d.connect(mock) foo.events.e.connect(mock) foo.events.e.emit("f", "e") mock.assert_called_once_with("f", "e") mock.reset_mock() foo.a = 2 mock.assert_called_once_with(2, 1) mock.reset_mock() foo.b = 3.0 mock.assert_called_once_with(3.0, 2.0) mock.reset_mock() foo.c = "c" mock.assert_not_called() foo.c = "cc" mock.assert_called_once_with("cc", "c") mock.reset_mock() foo._c = "ccc" mock.assert_called_once_with("ccc", "cc") mock.reset_mock() foo.d = "D" mock.assert_not_called() foo.d = "DD" mock.assert_called_once_with("dd", "d") mock.reset_mock() foo._e = 5 mock.assert_not_called() foo._e = 6 mock.assert_called_once_with(6, 5) mock.reset_mock() def test_bad_siggroup_descriptor_init(): with pytest.raises( TypeError, match="'signal_group_class' must be a subclass of 
SignalGroup", ): SignalGroupDescriptor(signal_group_class=type) # type: ignore with pytest.raises( ValueError, match=r"Cannot use SignalGroup with `collect_fields=False`.", ): SignalGroupDescriptor(collect_fields=False) with pytest.raises( ValueError, match="Cannot use a Callable for `signal_aliases` with `collect_fields=False`", ): SignalGroupDescriptor( collect_fields=False, signal_group_class=type("MyGroup", (SignalGroup,), {"x": Signal()}), signal_aliases=lambda x: None, ) psygnal-0.15.0/tests/test_group_descriptor.py0000644000000000000000000001613415073705675016371 0ustar00from contextlib import nullcontext from dataclasses import dataclass from typing import Any, ClassVar from unittest.mock import Mock, patch import pytest from psygnal import ( Signal, SignalGroup, SignalGroupDescriptor, _compiled, _group_descriptor, ) class MyGroup(SignalGroup): sig = Signal() @pytest.mark.parametrize("type_", ["dataclass", "pydantic", "attrs", "msgspec"]) def test_descriptor_inherits(type_: str) -> None: if type_ == "dataclass": from dataclasses import dataclass @dataclass class Base: a: int events: ClassVar = SignalGroupDescriptor() @dataclass class Foo(Base): b: str @dataclass class Bar(Foo): c: float elif type_ == "pydantic": pytest.importorskip("pydantic") from pydantic import BaseModel class Base(BaseModel): a: int events: ClassVar = SignalGroupDescriptor() class Foo(Base): b: str class Bar(Foo): c: float elif type_ == "attrs": from attrs import define @define class Base: a: int events: ClassVar = SignalGroupDescriptor() @define class Foo(Base): b: str @define class Bar(Foo): c: float elif type_ == "msgspec": msgspec = pytest.importorskip("msgspec") class Base(msgspec.Struct): # type: ignore a: int events: ClassVar = SignalGroupDescriptor() class Foo(Base): b: str class Bar(Foo): c: float assert Bar.events is Base.events with patch.object( _group_descriptor, "evented_setattr", wraps=_group_descriptor.evented_setattr ) as mock_decorator: base = Base(a=1) foo = Foo(a=1, b="2") bar = Bar(a=1, b="2", c=3.0) bar2 = Bar(a=1, b="2", c=3.0) # the patching of __setattr__ should only happen once # and it will happen only on the first access of .events mock_decorator.assert_not_called() assert set(base.events) == {"a"} assert set(foo.events) == {"a", "b"} assert set(bar.events) == {"a", "b", "c"} assert set(bar2.events) == {"a", "b", "c"} if not _compiled: # can't patch otherwise assert mock_decorator.call_count == 1 mock = Mock() foo.events.a.connect(mock) # base doesn't affect subclass base.events.a.emit(1) mock.assert_not_called() # subclass doesn't affect superclass bar.events.a.emit(1) mock.assert_not_called() foo.events.a.emit(1) mock.assert_called_once_with(1) @pytest.mark.parametrize("patch_setattr", [True, False]) def test_no_patching(patch_setattr: bool) -> None: """Test patch_setattr=False doesn't patch the class""" # sourcery skip: extract-duplicate-method @dataclass class Foo: a: int _events: ClassVar = SignalGroupDescriptor(patch_setattr=patch_setattr) with patch.object( _group_descriptor, "evented_setattr", wraps=_group_descriptor.evented_setattr ) as mock_decorator: foo = Foo(a=1) _ = foo._events if not _compiled: # can't patch otherwise assert mock_decorator.call_count == int(patch_setattr) assert _group_descriptor.is_evented(Foo.__setattr__) == patch_setattr mock = Mock() foo._events.a.connect(mock) foo.a = 2 if patch_setattr: mock.assert_called_once_with(2, 1) else: mock.assert_not_called() def test_direct_patching() -> None: """Test directly using evented_setattr on a class""" mock1 = 
Mock() @dataclass class Foo: a: int _events: ClassVar = SignalGroupDescriptor(patch_setattr=False) @_group_descriptor.evented_setattr("_events") def __setattr__(self, __name: str, __value: Any) -> None: old = getattr(self, __name, None) mock1(__name, __value, old) super().__setattr__(__name, __value) assert _group_descriptor.is_evented(Foo.__setattr__) # patch again ... this should NOT cause a double event emission. Foo.__setattr__ = _group_descriptor.evented_setattr("_events", Foo.__setattr__) foo = Foo(a=1) mock = Mock() foo._events.a.connect(mock) foo.a = 2 mock.assert_called_once_with(2, 1) # confirm no double event emission mock1.assert_called_with("a", 2, 1) def test_no_getattr_on_non_evented_fields() -> None: """Make sure that we're not accidentally calling getattr on non-evented fields.""" a_mock = Mock() b_mock = Mock() @dataclass class Foo: a: int events: ClassVar = SignalGroupDescriptor() @property def b(self) -> int: b_mock(self._b) return self._b @b.setter def b(self, value: int) -> None: self._b = value foo = Foo(a=1) foo.events.a.connect(a_mock) foo.a = 2 a_mock.assert_called_once_with(2, 1) foo.b = 1 b_mock.assert_not_called() # getter shouldn't have been called assert foo.b == 1 b_mock.assert_called_once_with(1) # getter should have been called only once def test_evented_field_connect_setattr() -> None: """Test that using connect_setattr""" @dataclass class Foo: a: int events: ClassVar = SignalGroupDescriptor() class Bar: x = 1 y = 1 foo = Foo(a=1) bar = Bar() foo.events.a.connect_setattr(bar, "x") foo.events.a.connect_setattr(bar, "y", maxargs=None) foo.events.a.emit(2, 1) assert bar.x == 2 # this is likely the desired outcome # this is a bit of a gotcha, but it's the expected behavior # when using connect_setattr with maxargs=None # remove this test if/when we change maxargs to default to 1 on SignalInstance assert bar.y == (2, 1) # type: ignore @pytest.mark.parametrize("collect", [True, False]) @pytest.mark.parametrize("klass", [None, SignalGroup, MyGroup]) def test_collect_fields(collect: bool, klass: type[SignalGroup] | None) -> None: signal_class = klass or SignalGroup should_fail_def = signal_class is SignalGroup and collect is False ctx = pytest.raises(ValueError) if should_fail_def else nullcontext() with ctx: @dataclass class Foo: events: ClassVar = SignalGroupDescriptor( warn_on_no_fields=False, signal_group_class=klass, collect_fields=collect, ) a: int = 1 if should_fail_def: return @dataclass class Bar(Foo): b: float = 2.0 foo = Foo() bar = Bar() assert issubclass(type(foo.events), signal_class) if collect: assert type(foo.events) is not signal_class assert "a" in foo.events assert "a" in bar.events assert "b" in bar.events else: assert type(foo.events) is signal_class assert "a" not in foo.events assert "a" not in bar.events assert "b" not in bar.events if signal_class is MyGroup: assert "sig" in foo.events assert "sig" in bar.events psygnal-0.15.0/tests/test_path_coverage.py0000644000000000000000000000000015073705675015567 0ustar00psygnal-0.15.0/tests/test_path_step.py0000644000000000000000000000736015073705675014767 0ustar00"""Tests to cover missing lines in path/emission functionality.""" from inspect import Signature import pytest from psygnal import EmissionInfo, PathStep, Signal, SignalGroup, SignalInstance def test_pathstep_validation(): """Test PathStep validation error conditions.""" # Test empty PathStep (should fail) with pytest.raises(ValueError, match="exactly one of attr, index, or key"): PathStep() # Test multiple fields set (should fail) with 
pytest.raises(ValueError, match="exactly one of attr, index, or key"): PathStep(attr="test", index=1) def test_pathstep_repr(): """Test PathStep repr formatting, including long key truncation.""" # Test attr assert repr(PathStep(attr="test")) == ".test" # Test index assert repr(PathStep(index=5)) == "[5]" # Test short key assert repr(PathStep(key="short")) == "['short']" # Test long key truncation class CrazyHashable: """A class with a long __repr__.""" def __repr__(self): return "a" * 100 ps = PathStep(key=CrazyHashable()) result = repr(ps) assert "..." in result assert len(result) <= 25 # should be truncated def test_emission_info_path_validation(): """Test EmissionInfo path validation.""" # Create a SignalInstance properly instance = SignalInstance(Signature()) # Valid paths should work EmissionInfo(instance, (1,), (PathStep(attr="test"),)) EmissionInfo(instance, (1,), (PathStep(index=0), PathStep(key="key"))) # Invalid path types should fail with pytest.raises(TypeError): EmissionInfo(instance, (1,), (object(),)) # type: ignore def test_signal_relay_no_emitter(): """Test SignalRelay when no current emitter.""" class TestGroup(SignalGroup): test_signal = Signal(int) group = TestGroup() # Test that _slot_relay returns early when no current emitter # This should not raise an error and should not emit anything relay = group.all relay._slot_relay(1, 2, 3) # Should return early due to no emitter def test_signal_group_repr_without_instance(): """Test SignalGroup repr when instance is None.""" class TestGroup(SignalGroup): test_signal = Signal(int) # Create group without instance group = TestGroup() repr_str = repr(group) assert "TestGroup" in repr_str assert "instance at" not in repr_str # Should not have instance info def test_signal_group_repr_with_instance(): """Test SignalGroup repr when instance is not None.""" class TestGroup(SignalGroup): test_signal = Signal(int) class TestObj: def __init__(self): self.events = TestGroup(instance=self) # Create group with instance obj = TestObj() repr_str = repr(obj.events) assert "TestGroup" in repr_str assert "instance at" in repr_str # Should have instance info assert "TestObj" in repr_str def test_list_signal_instance_relocate_empty_args(): """Test ListSignalInstance _psygnal_relocate_info_ with empty args.""" from psygnal.containers._evented_list import ListSignalInstance # Create a signal instance list_sig = ListSignalInstance((int,)) # Test with empty args info = EmissionInfo(list_sig, ()) result = list_sig._psygnal_relocate_info_(info) assert result is info # Should return unchanged def test_dict_signal_instance_relocate_empty_args(): """Test DictSignalInstance _psygnal_relocate_info_ with empty args.""" from psygnal.containers._evented_dict import DictSignalInstance # Create a signal instance dict_sig = DictSignalInstance((str,)) # Test with empty args info = EmissionInfo(dict_sig, ()) result = dict_sig._psygnal_relocate_info_(info) assert result is info # Should return unchanged psygnal-0.15.0/tests/test_psygnal.py0000644000000000000000000007747215073705675014470 0ustar00import gc import os from contextlib import suppress from functools import partial, wraps from inspect import Signature from typing import Literal from unittest.mock import MagicMock, Mock, call import pytest import psygnal from psygnal import EmitLoopError, Signal, SignalInstance from psygnal._signal import ReemissionMode, ReemissionVal from psygnal._weak_callback import WeakCallback WINDOWS = os.name == "nt" COMPILED = psygnal._compiled def stupid_decorator(fun): def 
_fun(*args): fun(*args) _fun.__annotations__ = fun.__annotations__ _fun.__name__ = "f_no_arg" return _fun def good_decorator(fun): @wraps(fun) def _fun(*args): fun(*args) return _fun # fmt: off class Emitter: no_arg = Signal() one_int = Signal(int) two_int = Signal(int, int) str_int = Signal(str, int) no_check = Signal(str, check_nargs_on_connect=False, check_types_on_connect=False) class MyObj: def f_no_arg(self): ... def f_str_int_vararg(self, a: str, b: int, *c): ... def f_str_int_any(self, a: str, b: int, c): ... def f_str_int_kwarg(self, a: str, b: int, c=None): ... def f_str_int(self, a: str, b: int): ... def f_str_any(self, a: str, b): ... def f_str(self, a: str): ... def f_int(self, a: int): ... def f_any(self, a): ... def f_int_int(self, a: int, b: int): ... def f_str_str(self, a: str, b: str): ... def f_arg_kwarg(self, a, b=None): ... def f_vararg(self, *a): ... def f_vararg_varkwarg(self, *a, **b): ... def f_vararg_kwarg(self, *a, b=None): ... @stupid_decorator def f_int_decorated_stupid(self, a: int): ... @good_decorator def f_int_decorated_good(self, a: int): ... f_any_assigned = lambda self, a: None # noqa x: int = 0 def __setitem__(self, key: str, value: int): if key == "x": self.x = value def f_no_arg(): ... def f_str_int_vararg(a: str, b: int, *c): ... def f_str_int_any(a: str, b: int, c): ... def f_str_int_kwarg(a: str, b: int, c=None): ... def f_str_int(a: str, b: int): ... def f_str_any(a: str, b): ... def f_str(a: str): ... def f_int(a: int): ... def f_any(a): ... def f_int_int(a: int, b: int): ... def f_str_str(a: str, b: str): ... def f_arg_kwarg(a, b=None): ... def f_vararg(*a): ... def f_vararg_varkwarg(*a, **b): ... def f_vararg_kwarg(*a, b=None): ... class MyReceiver: expect_signal = None expect_sender = None expect_name = None def assert_sender(self, *a): assert Signal.current_emitter() is self.expect_signal assert self.expect_name in repr(Signal.current_emitter()) assert Signal.current_emitter().instance is self.expect_sender assert Signal.sender() is self.expect_sender assert Signal.current_emitter()._name is self.expect_name def assert_not_sender(self, *a): # just to make sure we're actually calling it assert Signal.current_emitter().instance is not self.expect_sender # fmt: on def test_basic_signal(): """standard Qt usage, as class attribute""" emitter = Emitter() mock = MagicMock() emitter.one_int.connect(mock) emitter.one_int.emit(1) mock.assert_called_once_with(1) mock.reset_mock() # calling directly also works emitter.one_int(1) mock.assert_called_once_with(1) def test_emit_fast(): """Test emit_fast method.""" emitter = Emitter() mock = MagicMock() emitter.one_int.connect(mock) emitter.one_int.emit_fast(1) mock.assert_called_once_with(1) mock.reset_mock() # calling directly also works emitter.one_int(1) mock.assert_called_once_with(1) mock.reset_mock() with emitter.one_int.blocked(): emitter.one_int.emit_fast(2) mock.assert_not_called() with emitter.one_int.paused(): emitter.one_int.emit_fast(3) mock.assert_not_called() emitter.one_int.emit_fast(4) mock.assert_has_calls([call(3), call(4)]) def test_emit_fast_errors(): emitter = Emitter() err = ValueError() @emitter.one_int.connect def boom(v: int) -> None: raise err import re error_re = re.compile( f"signal 'tests.test_psygnal.Emitter.one_int'.*{re.escape(__file__)}", re.DOTALL, ) with pytest.raises(EmitLoopError, match=error_re): emitter.one_int.emit_fast(42) def test_emit_fast_recursion_errors(): """Test emit_fast method.""" emitter = Emitter() emitter.one_int.emit_fast(1) @emitter.one_int.connect def 
callback() -> None: emitter.one_int.emit(2) with pytest.raises(RecursionError): emitter.one_int.emit_fast(3) emitter.one_int.disconnect(callback) @emitter.one_int.connect def callback() -> None: emitter.one_int.emit_fast(2) with pytest.raises(RecursionError): emitter.one_int.emit_fast(3) def test_decorator(): emitter = Emitter() err = ValueError() @emitter.one_int.connect def boom(v: int) -> None: raise err @emitter.one_int.connect(check_nargs=False) def bad_cb(a, b, c): ... import re error_re = re.compile( f"signal 'tests.test_psygnal.Emitter.one_int'.*{re.escape(__file__)}", re.DOTALL, ) with pytest.raises(EmitLoopError, match=error_re) as e: emitter.one_int.emit(42) assert e.value.__cause__ is err assert e.value.__context__ is err def test_misc(): emitter = Emitter() assert isinstance(Emitter.one_int, Signal) assert isinstance(emitter.one_int, SignalInstance) with pytest.raises(AttributeError): _ = emitter.one_int.asdf with pytest.raises(AttributeError): _ = emitter.one_int.asdf def test_getattr(): s = Signal() with pytest.raises(AttributeError): _ = s.not_a_thing def test_signature_provided(): s = Signal(Signature()) assert s.signature == Signature() with pytest.warns(UserWarning): s = Signal(Signature(), 1) def test_emit_checks(): emitter = Emitter() emitter.one_int.connect(f_no_arg) emitter.one_int.emit(check_nargs=False) emitter.one_int.emit() with pytest.raises(TypeError): emitter.one_int.emit(check_nargs=True) emitter.one_int.emit(1) emitter.one_int.emit(1, 2, check_nargs=False) emitter.one_int.emit(1, 2) with pytest.raises(TypeError): emitter.one_int.emit(1, 2, check_nargs=True) with pytest.raises(TypeError): emitter.one_int.emit("sdr", check_types=True) emitter.one_int.emit("sdr", check_types=False) def test_basic_signal_blocked(): """standard Qt usage, as class attribute""" emitter = Emitter() mock = MagicMock() emitter.one_int.connect(mock) emitter.one_int.emit(1) mock.assert_called_once_with(1) mock.reset_mock() with emitter.one_int.blocked(): emitter.one_int.emit(1) mock.assert_not_called() def test_nested_signal_blocked(): """unblock signal on exit of the last context""" emitter = Emitter() mock = MagicMock() emitter.one_int.connect(mock) mock.reset_mock() with emitter.one_int.blocked(): with emitter.one_int.blocked(): emitter.one_int.emit(1) emitter.one_int.emit(2) emitter.one_int.emit(3) mock.assert_called_once_with(3) @pytest.mark.parametrize("thread", [None, "main"]) def test_disconnect(thread: Literal[None, "main"]) -> None: emitter = Emitter() mock = MagicMock() with pytest.raises(ValueError) as e: emitter.one_int.disconnect(mock, missing_ok=False) assert "slot is not connected" in str(e) emitter.one_int.disconnect(mock) emitter.one_int.connect(mock, thread=thread) assert len(emitter.one_int) == 1 if thread is None: emitter.one_int.emit(1) mock.assert_called_once_with(1) mock.reset_mock() emitter.one_int.disconnect(mock) emitter.one_int.emit(1) mock.assert_not_called() assert len(emitter.one_int) == 0 @pytest.mark.parametrize( "type_", [ "function", "lambda", "method", "partial_method", "toolz_function", "toolz_method", "partial_method_kwarg", "partial_method_kwarg_bad", "setattr", "setitem", ], ) def test_slot_types(type_: str) -> None: emitter = Emitter() signal = emitter.one_int assert len(signal) == 0 obj = MyObj() if type_ == "setattr": with pytest.warns(FutureWarning, match="The default value of maxargs"): signal.connect_setattr(obj, "x") elif type_ == "setitem": with pytest.warns(FutureWarning, match="The default value of maxargs"): signal.connect_setitem(obj, 
"x") elif type_ == "function": signal.connect(f_int) elif type_ == "lambda": signal.connect(lambda x: None) elif type_ == "method": signal.connect(obj.f_int) elif type_ == "partial_method": signal.connect(partial(obj.f_int_int, 2)) elif type_ == "toolz_function": toolz = pytest.importorskip("toolz") signal.connect(toolz.curry(f_int_int, 2)) elif type_ == "toolz_method": toolz = pytest.importorskip("toolz") signal.connect(toolz.curry(obj.f_int_int, 2)) elif type_ == "partial_method_kwarg": signal.connect(partial(obj.f_int_int, b=2)) elif type_ == "partial_method_kwarg_bad": with pytest.raises(ValueError, match=r".*prefer using positional args"): signal.connect(partial(obj.f_int_int, a=2)) return assert len(signal) == 1 stored_slot = signal._slots[-1] assert isinstance(stored_slot, WeakCallback) assert stored_slot == stored_slot with pytest.raises(TypeError): emitter.one_int.connect("not a callable") # type: ignore def test_basic_signal_with_sender_receiver(): """standard Qt usage, as class attribute""" emitter = Emitter() receiver = MyReceiver() receiver.expect_sender = emitter receiver.expect_signal = emitter.one_int receiver.expect_name = "one_int" assert Signal.current_emitter() is None emitter.one_int.connect(receiver.assert_sender) emitter.one_int.emit(1) # back to none after the call is over. assert Signal.current_emitter() is None emitter.one_int.disconnect() # sanity check... to make sure that methods are in fact being called. emitter.one_int.connect(receiver.assert_not_sender) with pytest.raises(EmitLoopError) as e: emitter.one_int.emit(1) assert isinstance(e.value.__cause__, AssertionError) assert isinstance(e.value.__context__, AssertionError) def test_basic_signal_with_sender_nonreceiver(): """standard Qt usage, as class attribute""" emitter = Emitter() nr = MyObj() emitter.one_int.connect(nr.f_no_arg) emitter.one_int.connect(nr.f_int) emitter.one_int.connect(nr.f_vararg_varkwarg) emitter.one_int.emit(1) # emitter.one_int.connect(nr.two_int) def test_signal_instance(): """make a signal instance without a class""" signal = SignalInstance((int,)) mock = MagicMock() signal.connect(mock) signal.emit(1) mock.assert_called_once_with(1) signal = SignalInstance() mock = MagicMock() signal.connect(mock) signal.emit() mock.assert_called_once_with() @pytest.mark.parametrize( "slot", [ "f_no_arg", "f_int_decorated_stupid", "f_int_decorated_good", "f_any_assigned", "partial", "toolz_curry", ], ) def test_weakref(slot): """Test that a connected method doesn't hold strong ref.""" emitter = Emitter() obj = MyObj() assert len(emitter.one_int) == 0 if slot == "partial": emitter.one_int.connect(partial(obj.f_int_int, 1)) elif slot == "toolz_curry": toolz = pytest.importorskip("toolz") emitter.one_int.connect(toolz.curry(obj.f_int_int, 1)) else: emitter.one_int.connect(getattr(obj, slot)) assert len(emitter.one_int) == 1 emitter.one_int.emit(1) assert len(emitter.one_int) == 1 del obj gc.collect() emitter.one_int.emit(1) # this should trigger deletion assert len(emitter.one_int) == 0 @pytest.mark.parametrize( "slot", [ "f_no_arg", "f_int_decorated_stupid", "f_int_decorated_good", "f_any_assigned", "partial", ], ) def test_group_weakref(slot: str) -> None: """Test that a connected method doesn't hold strong ref.""" from psygnal import SignalGroup class MyGroup(SignalGroup): sig1 = Signal(int) group = MyGroup() obj = MyObj() assert len(group.sig1) == 0 # but the group itself doesn't have any assert len(group._psygnal_relay) == 0 # connecting something to the group adds to the group connections 
group.all.connect( partial(obj.f_int_int, 1) if slot == "partial" else getattr(obj, slot) ) assert len(group.sig1) == 1 assert len(group._psygnal_relay) == 1 group.sig1.emit(1) assert len(group.sig1) == 1 del obj gc.collect() group.sig1.emit(1) # this should trigger deletion, so would emitter.emit() assert len(group.sig1) == 0 # NOTE! this is 0 not 1, because the relay is also gone assert len(group._psygnal_relay) == 0 # it's been cleaned up # def test_norm_slot(): # r = MyObj() # normed1 = _normalize_slot(r.f_any) # normed2 = _normalize_slot(normed1) # normed3 = _normalize_slot((r, "f_any", None)) # normed4 = _normalize_slot((weakref.ref(r), "f_any", None)) # assert normed1 == (weakref.ref(r), "f_any", None) # assert normed1 == normed2 == normed3 == normed4 # assert _normalize_slot(f_any) == f_any ALL = {n for n, f in locals().items() if callable(f) and n.startswith("f_")} COUNT_INCOMPATIBLE = { "no_arg": ALL - {"f_no_arg", "f_vararg", "f_vararg_varkwarg", "f_vararg_kwarg"}, "one_int": { "f_int_int", "f_str_any", "f_str_int_any", "f_str_int_kwarg", "f_str_int_vararg", "f_str_int", "f_str_str", }, "str_int": {"f_str_int_any"}, } SIG_INCOMPATIBLE = { "no_arg": {"f_int_int", "f_int", "f_str_int_any", "f_str_str"}, "one_int": { "f_int_int", "f_str_int_any", "f_str_int_vararg", "f_str_str", "f_str", }, "str_int": {"f_int_int", "f_int", "f_str_int_any", "f_str_str"}, } @pytest.mark.parametrize("typed", ["typed", "untyped"]) @pytest.mark.parametrize("func_name", ALL) @pytest.mark.parametrize("sig_name", ["no_arg", "one_int", "str_int"]) @pytest.mark.parametrize("mode", ["func", "meth", "partial"]) def test_connect_validation(func_name, sig_name, mode, typed): from functools import partial if mode == "meth": func = getattr(MyObj(), func_name) elif mode == "partial": func = partial(globals()[func_name]) else: func = globals()[func_name] e = Emitter() check_types = typed == "typed" signal: SignalInstance = getattr(e, sig_name) bad_count = COUNT_INCOMPATIBLE[sig_name] bad_sig = SIG_INCOMPATIBLE[sig_name] if func_name in bad_count or (check_types and func_name in bad_sig): with pytest.raises(ValueError) as er: signal.connect(func, check_types=check_types) assert "Accepted signature:" in str(er) return signal.connect(func, check_types=check_types) args = (p.annotation() for p in signal.signature.parameters.values()) signal.emit(*args) def test_connect_lambdas(): e = Emitter() assert len(e.two_int._slots) == 0 e.two_int.connect(lambda: None) e.two_int.connect(lambda x: None) assert len(e.two_int._slots) == 2 e.two_int.connect(lambda x, y: None) e.two_int.connect(lambda x, y, z=None: None) assert len(e.two_int._slots) == 4 e.two_int.connect(lambda x, y, *z: None) e.two_int.connect(lambda *z: None) assert len(e.two_int._slots) == 6 e.two_int.connect(lambda *z, **k: None) assert len(e.two_int._slots) == 7 with pytest.raises(ValueError): e.two_int.connect(lambda x, y, z: None) def test_mock_connect(): e = Emitter() e.one_int.connect(MagicMock()) # fmt: off class TypeA: ... class TypeB(TypeA): ... class TypeC(TypeB): ... class Rcv: def methodA(self, obj: TypeA): ... def methodA_ref(self, obj: 'TypeA'): ... def methodB(self, obj: TypeB): ... def methodB_ref(self, obj: 'TypeB'): ... def methodOptB(self, obj: TypeB | None): ... def methodOptB_ref(self, obj: 'TypeB | None'): ... def methodC(self, obj: TypeC): ... def methodC_ref(self, obj: 'TypeC'): ... 
class Emt: signal = Signal(TypeB) # fmt: on def test_forward_refs_type_checking(): e = Emt() r = Rcv() e.signal.connect(r.methodB, check_types=True) e.signal.connect(r.methodB_ref, check_types=True) e.signal.connect(r.methodOptB, check_types=True) e.signal.connect(r.methodOptB_ref, check_types=True) e.signal.connect(r.methodC, check_types=True) e.signal.connect(r.methodC_ref, check_types=True) # signal is emitting a TypeB, but method is expecting a typeA assert not issubclass(TypeA, TypeB) # typeA is not a TypeB, so we get an error with pytest.raises(ValueError): e.signal.connect(r.methodA, check_types=True) with pytest.raises(ValueError): e.signal.connect(r.methodA_ref, check_types=True) def test_checking_off(): e = Emitter() # the no_check signal was instantiated with check_[nargs/types] = False @e.no_check.connect def bad_in_many_ways(x: int, y, z): ... def test_keyword_only_not_allowed(): e = Emitter() def f(a: int, *, b: int): ... with pytest.raises(ValueError) as er: e.two_int.connect(f) assert "Unsupported KEYWORD_ONLY parameters in signature" in str(er) def test_unique_connections(): e = Emitter() assert len(e.one_int._slots) == 0 e.one_int.connect(f_no_arg, unique=True) assert len(e.one_int._slots) == 1 e.one_int.connect(f_no_arg, unique=True) assert len(e.one_int._slots) == 1 with pytest.raises(ValueError): e.one_int.connect(f_no_arg, unique="raise") assert len(e.one_int._slots) == 1 e.one_int.connect(f_no_arg) assert len(e.one_int._slots) == 2 def test_sig_unavailable(): """In some cases, signature.inspect() fails on a callable, (many builtins). We should still connect, but with a warning. """ e = Emitter() e.one_int.connect(vars, check_nargs=False) # no warning with pytest.warns(UserWarning): e.one_int.connect(vars) # we've special cased print... due to frequency of use. 
e.one_int.connect(print) # no warning def test_pause(): """Test that we can pause, and resume emission of (possibly reduced) args.""" emitter = Emitter() mock = MagicMock() emitter.one_int.connect(mock) emitter.one_int.emit(1) mock.assert_called_once_with(1) mock.reset_mock() emitter.one_int.pause() emitter.one_int.emit(1) emitter.one_int.emit(2) emitter.one_int.emit(3) mock.assert_not_called() emitter.one_int.resume() mock.assert_has_calls([call(1), call(2), call(3)]) mock.reset_mock() with emitter.one_int.paused(lambda a, b: (a[0].union(set(b)),), (set(),)): emitter.one_int.emit(1) emitter.one_int.emit(2) emitter.one_int.emit(3) mock.assert_called_once_with({1, 2, 3}) mock.reset_mock() emitter.one_int.pause() emitter.one_int.resume() mock.assert_not_called() def test_resume_with_initial(): emitter = Emitter() mock = MagicMock() emitter.one_int.connect(mock) with emitter.one_int.paused(lambda a, b: (a[0] + b[0],)): emitter.one_int.emit(1) emitter.one_int.emit(2) emitter.one_int.emit(3) mock.assert_called_once_with(6) mock.reset_mock() with emitter.one_int.paused(lambda a, b: (a[0] + b[0],), (20,)): emitter.one_int.emit(1) emitter.one_int.emit(2) emitter.one_int.emit(3) mock.assert_called_once_with(26) def test_nested_pause(): emitter = Emitter() mock = MagicMock() emitter.one_int.connect(mock) with emitter.one_int.paused(): emitter.one_int.emit(1) emitter.one_int.emit(2) with emitter.one_int.paused(): emitter.one_int.emit(3) emitter.one_int.emit(4) emitter.one_int.emit(5) mock.assert_has_calls([call(i) for i in (1, 2, 3, 4, 5)]) def test_signals_on_unhashables(): class Emitter(dict): signal = Signal(int) e = Emitter() e.signal.connect(lambda x: print(x)) e.signal.emit(1) def test_property_connect(): class A: def __init__(self): self.li = [] @property def x(self): return self.li @x.setter def x(self, value): self.li.append(value) a = A() emitter = Emitter() emitter.one_int.connect_setattr(a, "x", maxargs=1) assert len(emitter.one_int) == 1 with pytest.warns(FutureWarning, match="The default value of maxargs"): emitter.two_int.connect_setattr(a, "x") assert len(emitter.two_int) == 1 emitter.one_int.emit(1) assert a.li == [1] emitter.two_int.emit(1, 1) assert a.li == [1, (1, 1)] emitter.two_int.disconnect_setattr(a, "x") assert len(emitter.two_int) == 0 with pytest.raises(ValueError): emitter.two_int.disconnect_setattr(a, "x", missing_ok=False) emitter.two_int.disconnect_setattr(a, "x") s = emitter.two_int.connect_setattr(a, "x", maxargs=1) emitter.two_int.emit(2, 3) assert a.li == [1, (1, 1), 2] emitter.two_int.disconnect(s, missing_ok=False) with pytest.raises(AttributeError): emitter.one_int.connect_setattr(a, "y", maxargs=None) def test_connect_setitem(): class T: sig = Signal(int) class SupportsItem: def __init__(self) -> None: self._dict = {} def __setitem__(self, key, value): self._dict[key] = value t = T() my_obj = SupportsItem() with pytest.warns(FutureWarning, match="The default value of maxargs"): t.sig.connect_setitem(my_obj, "x") t.sig.emit(5) assert my_obj._dict == {"x": 5} t.sig.disconnect_setitem(my_obj, "x") t.sig.emit(7) assert my_obj._dict == {"x": 5} obj = object() with pytest.raises(TypeError, match="does not support __setitem__"): t.sig.connect_setitem(obj, "x", maxargs=1) with pytest.raises(TypeError): t.sig.disconnect_setitem(obj, "x", missing_ok=False) def test_repr_not_used(): """Test that we don't use repr() or __call__ to check signature.""" mock = MagicMock() class T: def __repr__(self): mock() return "" def __call__(self): mock() t = T() sig = SignalInstance() 
sig.connect(t) mock.assert_not_called() # b.signal2.emit will warn that compiled SignalInstances cannot be weakly referenced @pytest.mark.filterwarnings("ignore:failed to create weakref:UserWarning") def test_signal_emit_as_slot(): class A: signal1 = Signal(int) class B: signal2 = Signal(int) mock = Mock() a = A() b = B() a.signal1.connect(b.signal2.emit) b.signal2.connect(mock) a.signal1.emit(1) mock.assert_called_once_with(1) mock.reset_mock() a.signal1.disconnect(b.signal2.emit) a.signal1.connect(b.signal2) # you can also just connect the signal instance a.signal1.emit(2) mock.assert_called_once_with(2) def test_emit_loop_exceptions(): emitter = Emitter() mock1 = Mock(side_effect=ValueError("Bad callback!")) mock2 = Mock() emitter.one_int.connect(mock1) emitter.one_int.connect(mock2) with pytest.raises(EmitLoopError): emitter.one_int.emit(1) mock1.assert_called_once_with(1) mock1.reset_mock() mock2.assert_not_called() with suppress(EmitLoopError): emitter.one_int.emit(2) mock1.assert_called_once_with(2) mock1.assert_called_once_with(2) @pytest.mark.parametrize( "slot", [ "f_no_arg", "f_int_decorated_stupid", "f_int_decorated_good", "f_any_assigned", "partial", "partial_kwargs", "partial", ], ) def test_weakref_disconnect(slot): """Test that a connected method doesn't hold strong ref.""" emitter = Emitter() obj = MyObj() assert len(emitter.one_int) == 0 if slot == "partial": cb = partial(obj.f_int_int, 1) elif slot == "partial_kwargs": cb = partial(obj.f_int_int, b=1) else: cb = getattr(obj, slot) emitter.one_int.connect(cb) assert len(emitter.one_int) == 1 emitter.one_int.emit(1) assert len(emitter.one_int) == 1 emitter.one_int.disconnect(cb) assert len(emitter.one_int) == 0 def test_queued_connections(): from threading import Thread, current_thread from psygnal import emit_queued this_thread = current_thread() emitter = Emitter() # function to run in another thread def _run(): emit_queued() emitter.one_int.emit(2) other_thread = Thread(target=_run) this_thread_mock = Mock() other_thread_mock = Mock() any_thread_mock = Mock() # mock1 wants to be called in this thread @emitter.one_int.connect(thread=this_thread) def cb1(arg): this_thread_mock(arg, current_thread()) # mock2 wants to be called in other_thread @emitter.one_int.connect(thread=other_thread) def cb2(arg): other_thread_mock(arg, current_thread()) # mock3 wants to be called in whatever thread the emitter is in @emitter.one_int.connect def cb3(arg): any_thread_mock(arg, current_thread()) # emit in this thread emitter.one_int.emit(1) this_thread_mock.assert_called_once_with(1, this_thread) # other_thread_mock not called because it's waiting for other_thread other_thread_mock.assert_not_called() # any_thread_mock called because it's waiting for any thread any_thread_mock.assert_called_once_with(1, this_thread) # Now we run `_run` in other_thread this_thread_mock.reset_mock() any_thread_mock.reset_mock() other_thread.start() other_thread.join() # now mock2 should be called TWICE. Once for the .emit(1) queued from this thread, # and once for the .emit(2) in other_thread other_thread_mock.assert_has_calls([call(1, other_thread), call(2, other_thread)]) # stuff queued for any_thread_mock should have also been called any_thread_mock.assert_called_once_with(2, other_thread) # stuff queued for this thread should NOT have been called this_thread_mock.assert_not_called() # ... 
until we call emit_queued() from this thread emit_queued() this_thread_mock.assert_called_once_with(2, this_thread) def test_deepcopy(): from copy import deepcopy mock = Mock() class T: sig = Signal() t = T() @t.sig.connect def f(): mock() t.sig.emit() mock.assert_called_once() mock.reset_mock() x = deepcopy(t) assert x is not t x.sig.emit() mock.assert_called_once() mock2 = Mock() class Foo: def method(self): mock2() foo = Foo() t.sig.connect(foo.method) t.sig.emit() mock2.assert_called_once() mock2.reset_mock() with pytest.warns(UserWarning, match="does not copy connected weakly referenced"): x2 = deepcopy(t) x2.sig.emit() mock2.assert_not_called() class T: sig = Signal() mock = Mock() def f(): return mock() def test_pickle(): import pickle t = T() t.sig.connect(f) _dump = pickle.dumps(t) x = pickle.loads(_dump) x.sig.emit() mock.assert_called_once() @pytest.mark.parametrize("strategy", [ReemissionMode.QUEUED, ReemissionMode.IMMEDIATE]) def test_recursion_error(strategy: ReemissionMode) -> None: s = SignalInstance(reemission=strategy) @s.connect def callback() -> None: s.emit() with pytest.raises(RecursionError): s.emit() @pytest.mark.parametrize( "strategy", [ReemissionMode.QUEUED, ReemissionMode.IMMEDIATE, ReemissionMode.LATEST] ) def test_callback_order(strategy: ReemissionMode) -> None: sig = SignalInstance((int,), reemission=strategy) a = [] def cb1(value: int) -> None: a.append(value) if value == 1: sig.emit(2) def cb2(value: int) -> None: a.append(value * 10) if value == 2: sig.emit(3) def cb3(value: int) -> None: a.append(value * 100) sig.connect(cb1) sig.connect(cb2) sig.connect(cb3) sig.emit(1) if strategy == ReemissionMode.IMMEDIATE: # nested emission events immediately trigger the next nested level # before returning to process the remainder of the current loop assert a == [1, 2, 20, 3, 30, 300, 200, 10, 100] elif strategy == ReemissionMode.LATEST: # nested emission events immediately trigger the next nested level # and never return to process the remainder of the current loop assert a == [1, 2, 20, 3, 30, 300] elif strategy == ReemissionMode.QUEUED: # all callbacks are called once before the next one is called assert a == [1, 10, 100, 2, 20, 200, 3, 30, 300] @pytest.mark.parametrize("strategy", [ReemissionMode.QUEUED, ReemissionMode.IMMEDIATE]) def test_signal_order_suspend(strategy: ReemissionMode) -> None: """Test that signals are emitted in the order they were connected.""" sig = SignalInstance((int,), reemission=strategy) mock1 = Mock() mock2 = Mock() def callback(x): if x < 10: sig.emit(10) def callback2(x): if x == 10: sig.emit(11) def callback3(x): if x == 10: with sig.paused(reducer=lambda a, b: (a[0] + b[0],)): for i in range(12, 15): sig.emit(i) sig.connect(mock1) sig.connect(callback) sig.connect(callback2) sig.connect(callback3) sig.connect(mock2) sig.emit(1) mock1.assert_has_calls([call(1), call(10), call(11), call(39)]) if strategy == ReemissionMode.IMMEDIATE: mock2.assert_has_calls([call(11), call(39), call(10), call(1)]) elif strategy == ReemissionMode.QUEUED: mock2.assert_has_calls([call(1), call(10), call(11), call(39)]) def test_call_priority() -> None: """Test that signals are emitted in the order they were connected.""" emitter = Emitter() calls = [] emitter.no_arg.connect(lambda: calls.append(1)) emitter.no_arg.connect(lambda: calls.append(2), priority=5) emitter.no_arg.connect(lambda: calls.append(3), priority=-5) emitter.no_arg.connect(lambda: calls.append(4), priority=10) emitter.no_arg.emit() assert calls == [4, 2, 1, 3] def test_slotted_classes() 
-> None: class T: __slots__ = ("not_sig",) sig = Signal() t = T() mock = Mock() @t.sig.connect def f(): mock() t.sig.emit() mock.assert_called_once() assert t.sig is t.sig def test_emit_should_not_prevent_gc(): from weakref import WeakSet from psygnal import Signal class Obj: pass class SomethingWithSignal: changed = Signal(object) object_instances: WeakSet[Obj] = WeakSet() something = SomethingWithSignal() obj = Obj() object_instances.add(obj) something.changed.emit(obj) del obj assert len(object_instances) == 0 @pytest.mark.parametrize("strategy", ReemissionMode._members()) def test_emit_loop_error_message_construction(strategy: ReemissionVal) -> None: sig = SignalInstance((int,), reemission=strategy) sig.connect(lambda v: v == 1 and sig.emit(2)) # type: ignore sig.connect(lambda v: v == 2 and sig.emit(0)) # type: ignore sig.connect(lambda v: 1 / v) with pytest.raises(EmitLoopError, match="While emitting signal") as e: sig.emit(1) if strategy == "queued": # check that we show a useful message for confusign queued signals assert "NOTE" in str(e.value) def test_description(): description = "A signal" class T: sig = Signal(description=description) assert T.sig.description == description assert T().sig.description == description psygnal-0.15.0/tests/test_pydantic_support.py0000644000000000000000000000620215073705675016401 0ustar00from typing import Any, get_origin import pytest try: import pydantic except ImportError: pytest.skip("pydantic not installed", allow_module_level=True) from psygnal import containers V1 = pydantic.__version__.startswith("1") @pytest.mark.skipif(V1, reason="pydantic v1 has poor support for generics") @pytest.mark.parametrize( "hint", [ containers.EventedList[int], containers.SelectableEventedList[int], ], ) def test_evented_list_as_pydantic_field(hint: Any) -> None: class Model(pydantic.BaseModel): my_list: hint m = Model(my_list=[1, 2, 3]) # type: ignore assert m.my_list == [1, 2, 3] assert isinstance(m.my_list, get_origin(hint)) m2 = Model(my_list=containers.EventedList([1, 2, 3])) assert m2.my_list == [1, 2, 3] m3 = Model(my_list=[1, "2", 3]) # type: ignore assert m3.my_list == [1, 2, 3] assert isinstance(m3.my_list, get_origin(hint)) with pytest.raises(pydantic.ValidationError): Model(my_list=[1, 2, "string"]) # type: ignore @pytest.mark.skipif(V1, reason="pydantic v1 has poor support for generics") def test_evented_list_no_params_as_pydantic_field() -> None: class Model(pydantic.BaseModel): my_list: containers.EventedList m = Model(my_list=[1, 2, 3]) # type: ignore assert m.my_list == [1, 2, 3] assert isinstance(m.my_list, containers.EventedList) m3 = Model(my_list=[1, "string", 3]) # type: ignore assert m3.my_list == [1, "string", 3] assert isinstance(m3.my_list, containers.EventedList) @pytest.mark.skipif(V1, reason="pydantic v1 has poor support for generics") @pytest.mark.parametrize( "hint", [ containers.EventedSet[str], containers.EventedOrderedSet[str], containers.Selection[str], ], ) def test_evented_set_as_pydantic_field(hint: Any) -> None: class Model(pydantic.BaseModel): my_set: hint model_config = {"coerce_numbers_to_str": True} m = Model(my_set=[1, 2]) # type: ignore assert m.my_set == {"1", "2"} # type: ignore assert isinstance(m.my_set, get_origin(hint)) m2 = Model(my_set=containers.EventedSet(["a", "b"])) assert m2.my_set == {"a", "b"} # type: ignore m3 = Model(my_set=[1, "2", 3]) # type: ignore assert m3.my_set == {"1", "2", "3"} # type: ignore assert isinstance(m3.my_set, get_origin(hint)) @pytest.mark.skipif(V1, reason="pydantic v1 has poor 
support for generics") def test_evented_dict_as_pydantic_field() -> None: class Model(pydantic.BaseModel): my_dict: containers.EventedDict[str, int] model_config = {"coerce_numbers_to_str": True} m = Model(my_dict={"a": 1}) # type: ignore assert m.my_dict == {"a": 1} assert isinstance(m.my_dict, containers.EventedDict) m2 = Model(my_dict=containers.EventedDict({"a": 1})) assert m2.my_dict == {"a": 1} assert isinstance(m2.my_dict, containers.EventedDict) m3 = Model(my_dict={1: "2"}) # type: ignore assert m3.my_dict == {"1": 2} assert isinstance(m3.my_dict, containers.EventedDict) with pytest.raises(pydantic.ValidationError): Model(my_dict={"a": "string"}) # type: ignore psygnal-0.15.0/tests/test_pyinstaller_hook.py0000644000000000000000000000275315073705675016367 0ustar00import importlib.util import os import subprocess import warnings from pathlib import Path import pytest import psygnal def test_hook_content(): spec = importlib.util.spec_from_file_location( "hook", os.path.join( os.path.dirname(psygnal.__file__), "_pyinstaller_util", "hook-psygnal.py" ), ) hook = importlib.util.module_from_spec(spec) spec.loader.exec_module(hook) assert "mypy_extensions" in hook.hiddenimports if not psygnal._compiled: return assert "psygnal._dataclass_utils" in hook.hiddenimports @pytest.mark.skipif(not os.getenv("CI"), reason="slow test") def test_pyintstaller_hiddenimports(tmp_path: Path) -> None: with warnings.catch_warnings(): warnings.simplefilter("ignore") pyi_main = pytest.importorskip("PyInstaller.__main__") build_path = tmp_path / "build" dist_path = tmp_path / "dist" app_name = "psygnal_test" app = tmp_path / f"{app_name}.py" app.write_text("\n".join(["import psygnal", "print(psygnal.__version__)"])) args = [ # Place all generated files in ``tmp_path``. "--workpath", str(build_path), "--distpath", str(dist_path), "--specpath", str(tmp_path), str(app), ] with warnings.catch_warnings(): warnings.simplefilter("ignore") # silence warnings about deprecations pyi_main.run(args) subprocess.run([str(dist_path / app_name / app_name)], check=True) psygnal-0.15.0/tests/test_qt_compat.py0000644000000000000000000001020615073705675014760 0ustar00"""qtbot should work for testing!""" from collections.abc import Callable from threading import Thread, current_thread, main_thread from typing import TYPE_CHECKING, Any, Literal from unittest.mock import Mock import pytest from psygnal import Signal from psygnal._signal import _guess_qtsignal_signature pytest.importorskip("pytestqt") if TYPE_CHECKING: from pytestqt.qtbot import QtBot def _equals(*val: Any) -> Callable[[tuple[Any, ...]], bool]: def _inner(*other: Any) -> bool: return other == val return _inner def test_wait_signals(qtbot: "QtBot") -> None: class Emitter: sig1 = Signal() sig2 = Signal(int) sig3 = Signal(int, int) e = Emitter() with qtbot.waitSignal(e.sig2, check_params_cb=_equals(1)): e.sig2.emit(1) with qtbot.waitSignal(e.sig3, check_params_cb=_equals(2, 3)): e.sig3.emit(2, 3) with qtbot.waitSignals([e.sig3], check_params_cbs=[_equals(2, 3)]): e.sig3.emit(2, 3) signals = [e.sig1, e.sig2, e.sig3, e.sig1] checks = [_equals(), _equals(1), _equals(2, 3), _equals()] with qtbot.waitSignals(signals, check_params_cbs=checks, order="strict"): e.sig1.emit() e.sig2.emit(1) e.sig3.emit(2, 3) e.sig1.emit() def test_guess_signal_sig(qtbot: "QtBot") -> None: from qtpy import QtCore class QtObject(QtCore.QObject): qsig1 = QtCore.Signal() qsig2 = QtCore.Signal(int) qsig3 = QtCore.Signal(int, str) q_obj = QtObject() assert "qsig1()" in 
_guess_qtsignal_signature(q_obj.qsig1) assert "qsig1()" in _guess_qtsignal_signature(q_obj.qsig1.emit) assert "qsig2(int)" in _guess_qtsignal_signature(q_obj.qsig2) assert "qsig2(int)" in _guess_qtsignal_signature(q_obj.qsig2.emit) assert "qsig3(int,QString)" in _guess_qtsignal_signature(q_obj.qsig3) assert "qsig3(int,QString)" in _guess_qtsignal_signature(q_obj.qsig3.emit) def test_connect_qt_signal_instance(qtbot: "QtBot") -> None: from qtpy import QtCore class Emitter: sig1 = Signal() sig2 = Signal(int) sig3 = Signal(int, int) class QtObject(QtCore.QObject): qsig1 = QtCore.Signal() qsig2 = QtCore.Signal(int) q_obj = QtObject() e = Emitter() # the hard case: signal.emit takes less args than we emit def test_receives_1(value: int) -> bool: # making sure that qsig2.emit only receives and emits 1 value return value == 1 e.sig3.connect(q_obj.qsig2.emit) with qtbot.waitSignal(q_obj.qsig2, check_params_cb=test_receives_1): e.sig3.emit(1, 2) # too many # the "standard" cases, where params match e.sig1.connect(q_obj.qsig1.emit) with qtbot.waitSignal(q_obj.qsig1): e.sig1.emit() e.sig2.connect(q_obj.qsig2.emit) with qtbot.waitSignal(q_obj.qsig2): e.sig2.emit(1) # the flip case: signal.emit takes more args than we emit with pytest.raises(ValueError): e.sig1.connect(q_obj.qsig2.emit) e.sig1.emit() @pytest.mark.parametrize("thread", [None, "main"]) def test_q_main_thread_emit( thread: Literal["main", None], qtbot: "QtBot", qapp ) -> None: """Test using signal.emit(..., queue=True) ... and receiving it on the main thread with a QTimer connected to `emit_queued` """ from psygnal.qt import start_emitting_from_queue, stop_emitting_from_queue class C: sig = Signal(int) obj = C() mock = Mock() @obj.sig.connect(thread=thread) def _some_slot(val: int) -> None: mock(val) assert (current_thread() == main_thread()) == (thread == "main") def _emit_from_thread() -> None: assert current_thread() != main_thread() obj.sig.emit(1) with qtbot.waitSignal(obj.sig, timeout=1000): t = Thread(target=_emit_from_thread) t.start() t.join() qapp.processEvents() if thread is None: mock.assert_called_once_with(1) else: mock.assert_not_called() start_emitting_from_queue() qapp.processEvents() mock.assert_called_once_with(1) start_emitting_from_queue(10) # just for test coverage stop_emitting_from_queue() psygnal-0.15.0/tests/test_testing_utils.py0000644000000000000000000001261515073705675015674 0ustar00import re import pytest import psygnal.testing as pt from psygnal import Signal class MyObject: changed = Signal() value_changed = Signal(int) def test_assert_emitted() -> None: obj = MyObject() with pt.assert_emitted(obj.changed) as tester: obj.changed.emit() assert isinstance(tester, pt.SignalTester) with pytest.raises( AssertionError, match=r"Expected 'changed' to have been emitted." ): with pt.assert_emitted(obj.changed): pass def test_assert_emitted_once(): obj = MyObject() with pt.assert_emitted_once(obj.changed) as tester: obj.changed.emit() assert isinstance(tester, pt.SignalTester) with pytest.raises( AssertionError, match=r"Expected 'changed' to have been emitted once. Emitted 2 times.", ): with pt.assert_emitted_once(obj.changed): obj.changed.emit() obj.changed.emit() def test_assert_not_emitted() -> None: obj = MyObject() with pt.assert_not_emitted(obj.changed) as tester: pass assert isinstance(tester, pt.SignalTester) with pytest.raises( AssertionError, match=r"Expected 'changed' to not have been emitted. 
Emitted once.", ): with pt.assert_not_emitted(obj.changed): obj.changed.emit() with pytest.raises( AssertionError, match=r"Expected 'changed' to not have been emitted. Emitted 4 times.", ): with pt.assert_not_emitted(obj.changed): obj.changed.emit() obj.changed.emit() obj.changed.emit() obj.changed.emit() def test_assert_emitted_with() -> None: obj = MyObject() with pt.assert_emitted_with(obj.value_changed, 42) as tester: obj.value_changed.emit(41) obj.value_changed.emit(42) assert isinstance(tester, pt.SignalTester) with pytest.raises( AssertionError, match=re.escape( "Expected 'value_changed' to have been emitted with arguments (42,)." "\nActual: not emitted" ), ): with pt.assert_emitted_with(obj.value_changed, 42): pass with pytest.raises( AssertionError, match=re.escape( "Expected 'value_changed' to have been emitted with arguments (42,)." "\nActual: (43,)" ), ): with pt.assert_emitted_with(obj.value_changed, 42): obj.value_changed.emit(42) obj.value_changed.emit(43) def test_assert_emitted_once_with() -> None: obj = MyObject() with pt.assert_emitted_once_with(obj.value_changed, 42) as tester: obj.value_changed.emit(42) assert isinstance(tester, pt.SignalTester) with pytest.raises( AssertionError, match=re.escape( "Expected 'value_changed' to have been emitted exactly once. " "Emitted 2 times." ), ): with pt.assert_emitted_once_with(obj.value_changed, 42): obj.value_changed.emit(42) obj.value_changed.emit(42) with pytest.raises( AssertionError, match=re.escape( "Expected 'value_changed' to have been emitted once with arguments (42,)." "\nActual: (43,)" ), ): with pt.assert_emitted_once_with(obj.value_changed, 42): obj.value_changed.emit(43) def test_assert_ever_emitted_with() -> None: obj = MyObject() with pt.assert_ever_emitted_with(obj.value_changed, 42) as tester: obj.value_changed.emit(41) obj.value_changed.emit(42) obj.value_changed.emit(43) assert isinstance(tester, pt.SignalTester) with pytest.raises( AssertionError, match=re.escape( "Expected 'value_changed' to have been emitted at least once with " "arguments (42,)." "\nActual: not emitted" ), ): with pt.assert_ever_emitted_with(obj.value_changed, 42): pass with pytest.raises( AssertionError, match=re.escape( "Expected 'value_changed' to have been emitted at least once with " "arguments (42,)." "\nActual: (43,)" ), ): with pt.assert_ever_emitted_with(obj.value_changed, 42): obj.value_changed.emit(43) with pytest.raises( AssertionError, match=re.escape( "Expected 'value_changed' to have been emitted at least once with " "arguments (42,)." 
"\nActual: [(41,), (42, 43)]" ), ): with pt.assert_ever_emitted_with(obj.value_changed, 42): obj.value_changed.emit(41) obj.value_changed.emit(42, 43) def test_signal_tester() -> None: obj = MyObject() tester = pt.SignalTester(obj.changed) tester.connect() assert tester.signal_name == "changed" assert tester.mock.call_count == 0 obj.changed.emit() tester.assert_emitted_once() tester.assert_emitted() tester.assert_emitted_with() assert tester.emit_count == 1 tester.reset() assert tester.emit_count == 0 tester2 = pt.SignalTester(obj.value_changed) with tester2: obj.value_changed.emit(42) obj.value_changed.emit(43) tester2.assert_emitted() tester2.assert_emitted_with(43) tester2.assert_ever_emitted_with(42) assert tester2.emit_args_list == [(42,), (43,)] assert tester2.emit_count == 2 assert tester2.emit_args == (43,) psygnal-0.15.0/tests/test_throttler.py0000644000000000000000000000450015073705675015020 0ustar00import time from collections.abc import Callable from inspect import Parameter, signature from unittest.mock import Mock import pytest from psygnal import SignalInstance, _compiled, debounced, throttled def test_debounced() -> None: mock1 = Mock() f1 = debounced(mock1, timeout=10, leading=False) f2 = Mock() for _ in range(10): f1() f2() time.sleep(0.1) mock1.assert_called_once() assert f2.call_count == 10 def test_debounced_leading() -> None: mock1 = Mock() f1 = debounced(mock1, timeout=10, leading=True) f2 = Mock() for _ in range(10): f1() f2() time.sleep(0.1) assert mock1.call_count == 2 assert f2.call_count == 10 def test_throttled() -> None: mock1 = Mock() f1 = throttled(mock1, timeout=10, leading=True) f2 = Mock() for _ in range(10): f1() f2() time.sleep(0.1) assert mock1.call_count == 2 assert f2.call_count == 10 def test_throttled_trailing() -> None: mock1 = Mock() f1 = throttled(mock1, timeout=10, leading=False) f2 = Mock() for _ in range(10): f1() f2() time.sleep(0.1) assert mock1.call_count == 1 assert f2.call_count == 10 def test_cancel() -> None: mock1 = Mock() f1 = debounced(mock1, timeout=50, leading=False) f1() f1() f1.cancel() time.sleep(0.2) mock1.assert_not_called() def test_flush() -> None: mock1 = Mock() f1 = debounced(mock1, timeout=50, leading=False) f1() f1() f1.flush() time.sleep(0.2) mock1.assert_called_once() @pytest.mark.parametrize("deco", [debounced, throttled]) def test_throttled_debounced_signature(deco: Callable) -> None: mock = Mock() @deco(timeout=0, leading=True) def f1(x: int) -> None: """Doc.""" mock(x) # make sure we can still inspect the signature assert signature(f1).parameters["x"] == Parameter( "x", Parameter.POSITIONAL_OR_KEYWORD, annotation=int ) # make sure these are connectable sig = SignalInstance((int, int, int)) sig.connect(f1) sig.emit(1, 2, 3) mock.assert_called_once_with(1) if not _compiled: # unfortunately, dynamic assignment of __doc__ and stuff isn't possible in mypyc assert f1.__doc__ == "Doc." 
assert f1.__name__ == "f1" psygnal-0.15.0/tests/test_utils.py0000644000000000000000000000772215073705675014142 0ustar00import os import sys from pathlib import Path from unittest.mock import Mock, call import pytest from psygnal import EmissionInfo, Signal, SignalGroup from psygnal.utils import decompile, monitor_events, recompile def test_event_debugger(capsys) -> None: """Test that the event debugger works""" class M: sig = Signal(int, int) m = M() _logger = Mock() assert not m.sig._slots with monitor_events(m, _logger): assert len(m.sig._slots) == 1 m.sig.emit(1, 2) m.sig.emit(3, 4) assert _logger.call_count == 2 _logger.assert_has_calls( [call(EmissionInfo(m.sig, (1, 2))), call(EmissionInfo(m.sig, (3, 4)))] ) assert not m.sig._slots with monitor_events(m): m.sig.emit(1, 2) m.sig.emit(3, 4) captured = capsys.readouterr() assert captured.out == "sig.emit(1, 2)\nsig.emit(3, 4)\n" def test_old_monitor_api_dep_warning() -> None: class M: sig = Signal(int, int) mock = Mock() def _monitor(signal_name: str, args: tuple) -> None: mock(signal_name, args) m = M() with pytest.warns( UserWarning, match="logger functions must now take a single argument" ): with monitor_events(m, logger=_monitor): # type: ignore m.sig.emit(1, 2) mock.assert_called_once_with("sig", (1, 2)) with pytest.raises(ValueError, match="logger function must take a single argument"): with monitor_events(logger=_monitor): # type: ignore m.sig.emit(1, 2) mock.reset_mock() with monitor_events(m, logger=mock): m.sig.emit(1, 2) mock.assert_called_once_with(EmissionInfo(m.sig, (1, 2))) # global monitor mock.reset_mock() with monitor_events(logger=mock): m.sig.emit(1, 2) mock.assert_called_once_with(EmissionInfo(m.sig, (1, 2))) def test_monitor_all() -> None: class M: sig = Signal(int, int) m1 = M() m2 = M() _logger = Mock() with monitor_events(logger=_logger): m1.sig.emit(1, 2) m2.sig.emit(3, 4) m1.sig.emit(5, 6) m2.sig.emit(7, 8) assert _logger.call_args_list == [ call(EmissionInfo(m1.sig, (1, 2))), call(EmissionInfo(m2.sig, (3, 4))), call(EmissionInfo(m1.sig, (5, 6))), call(EmissionInfo(m2.sig, (7, 8))), ] def test_monitor_group() -> None: class MyGroup(SignalGroup): sig1 = Signal(int, int) sig2 = Signal(str, str) m1 = MyGroup() m2 = MyGroup() _logger = Mock() with monitor_events(logger=_logger): m1.sig1.emit(1, 2) m2.sig1.emit(3, 4) m1.sig1.emit(5, 6) m2.sig1.emit(7, 8) m1.sig2.emit("9", "10") m2.sig2.emit("11", "12") assert _logger.call_args_list == [ call(EmissionInfo(m1.sig1, (1, 2))), call(EmissionInfo(m2.sig1, (3, 4))), call(EmissionInfo(m1.sig1, (5, 6))), call(EmissionInfo(m2.sig1, (7, 8))), call(EmissionInfo(m1.sig2, ("9", "10"))), call(EmissionInfo(m2.sig2, ("11", "12"))), ] @pytest.mark.skipif(os.name == "nt", reason="rewrite open files on Windows is buggy") def test_decompile_recompile(monkeypatch): import psygnal was_compiled = psygnal._compiled decompile() monkeypatch.delitem(sys.modules, "psygnal") monkeypatch.delitem(sys.modules, "psygnal._signal") import psygnal assert not psygnal._compiled if was_compiled: assert list(Path(psygnal.__file__).parent.rglob("**/*_BAK")) recompile() monkeypatch.delitem(sys.modules, "psygnal") monkeypatch.delitem(sys.modules, "psygnal._signal") import psygnal assert psygnal._compiled def test_debug_import(monkeypatch): """Test that PSYGNAL_UNCOMPILED gives a warning.""" monkeypatch.delitem(sys.modules, "psygnal") monkeypatch.setenv("PSYGNAL_UNCOMPILED", "1") with pytest.warns(UserWarning, match="PSYGNAL_UNCOMPILED no longer has any effect"): import psygnal # noqa: F401 
psygnal-0.15.0/tests/test_weak_callable.py0000644000000000000000000001403115073705675015537 0ustar00import gc import re from functools import partial from typing import Any from unittest.mock import Mock from weakref import ref import pytest from psygnal import SignalInstance from psygnal._weak_callback import WeakCallback, weak_callback @pytest.mark.parametrize( "type_", [ "function", "toolz_function", "weak_func", "lambda", "method", "partial_method", "toolz_method", "setattr", "setitem", "mock", "weak_cb", "print", ], ) def test_slot_types(type_: str, capsys: Any) -> None: mock = Mock() final_mock = Mock() class MyObj: def method(self, x: int) -> None: mock(x) return x def __setitem__(self, key, value): mock(value) return value def __setattr__(self, __name: str, __value) -> None: if __name == "x": mock(__value) return __value obj = MyObj() if type_ == "setattr": cb = weak_callback(setattr, obj, "x", finalize=final_mock) elif type_ == "setitem": cb = weak_callback(obj.__setitem__, "x", finalize=final_mock) elif type_ in {"function", "weak_func"}: def obj(x: int) -> None: mock(x) return x cb = weak_callback(obj, strong_func=(type_ == "function"), finalize=final_mock) elif type_ == "toolz_function": toolz = pytest.importorskip("toolz") @toolz.curry def obj(z: int, x: int) -> None: mock(x) return x cb = weak_callback(obj(5), finalize=final_mock) elif type_ == "lambda": cb = weak_callback(lambda x: mock(x) and x, finalize=final_mock) elif type_ == "method": cb = weak_callback(obj.method, finalize=final_mock) elif type_ == "partial_method": cb = weak_callback(partial(obj.method, 2), max_args=0, finalize=final_mock) elif type_ == "toolz_method": toolz = pytest.importorskip("toolz") cb = weak_callback(toolz.curry(obj.method, 2), max_args=0, finalize=final_mock) elif type_ == "mock": cb = weak_callback(mock, finalize=final_mock) elif type_ == "weak_cb": cb = weak_callback(obj.method, finalize=final_mock) cb = weak_callback(cb, finalize=final_mock) elif type_ == "print": cb = weak_callback(print, finalize=final_mock) assert isinstance(cb, WeakCallback) assert isinstance(cb.slot_repr(), str) cb.cb((2,)) assert cb.dereference() is not None if type_ == "print": assert capsys.readouterr().out == "2\n" return mock.assert_called_once_with(2) mock.reset_mock() result = cb(2) if type_ not in ("setattr", "mock"): assert result == 2 mock.assert_called_once_with(2) del obj if type_ not in ("function", "toolz_function", "lambda", "mock"): final_mock.assert_called_once_with(cb) assert cb.dereference() is None with pytest.raises(ReferenceError): cb.cb((2,)) with pytest.raises(ReferenceError): cb(2) else: cb.cb((4,)) mock.assert_called_with(4) def test_weak_callable_equality() -> None: """Slot callers should be equal only if they represent the same bound-method.""" class T: def x(self): ... t1 = T() t2 = T() t1_ref = ref(t1) t2_ref = ref(t2) bmt1_a = weak_callback(t1.x) bmt1_b = weak_callback(t1.x) bmt2_a = weak_callback(t2.x) bmt2_b = weak_callback(t2.x) assert bmt1_a != "not a weak callback" def _assert_equality() -> None: assert bmt1_a == bmt1_b assert bmt2_a == bmt2_b assert bmt1_a != bmt2_a assert bmt1_b != bmt2_b _assert_equality() del t1 gc.collect() assert t1_ref() is None _assert_equality() del t2 gc.collect() assert t2_ref() is None _assert_equality() def test_nonreferencable() -> None: class T: __slots__ = ("x",) def method(self) -> None: ... 
t = T() with pytest.warns(UserWarning, match="failed to create weakref"): cb = weak_callback(t.method) assert cb.dereference() == t.method with pytest.raises(TypeError): weak_callback(t.method, on_ref_error="raise") cb = weak_callback(t.method, on_ref_error="ignore") assert cb.dereference() == t.method @pytest.mark.parametrize("strong", [True, False]) def test_deref(strong: bool) -> None: def func(x): ... p = partial(func, 1) cb = weak_callback(p, strong_func=strong) dp = cb.dereference() assert dp.func is p.func assert dp.args == p.args assert dp.keywords == p.keywords def test_queued_callbacks() -> None: from psygnal._queue import QueuedCallback def func(x): return x cb = weak_callback(func) qcb = QueuedCallback(cb, thread="current") assert qcb.dereference() is func assert qcb(1) == 1 def test_cb_raises() -> None: from psygnal import EmitLoopError sig = SignalInstance((int,), name="sig") class T: @property def x(self) -> int: return 1 @x.setter def x(self, value: int) -> None: 1 / value def __setitem__(self, key: str, value: int) -> Any: 1 / value def method(self, x: int) -> None: 1 / x t = T() sig.connect(t.method) error_re = re.compile( f"emitting signal.*'sig'.*{re.escape(__file__)}.*method", re.DOTALL ) with pytest.raises(EmitLoopError, match=error_re): sig.emit("a") sig.disconnect(t.method) sig.connect_setattr(t, "x", maxargs=1) error_re = re.compile( f"emitting signal.*'sig'.*{re.escape(__file__)}.*x", re.DOTALL ) with pytest.raises(EmitLoopError, match=error_re): sig.emit("a") sig.disconnect_setattr(t, "x") sig.connect_setitem(t, "x", maxargs=1) error_re = re.compile( f"emitting signal.*'sig'.*{re.escape(__file__)}.*__setitem__", re.DOTALL ) with pytest.raises(EmitLoopError, match=error_re): sig.emit("a") psygnal-0.15.0/tests/containers/test_evented_dict.py0000644000000000000000000001013715073705675017576 0ustar00from copy import copy from unittest.mock import Mock import pytest from psygnal.containers._evented_dict import EventedDict @pytest.fixture def regular_dict(): return {"A": 1, "B": 2, "C": 3} @pytest.fixture def test_dict(regular_dict): """EventedDict without basetype set.""" test_dict = EventedDict(regular_dict) test_dict.events = Mock(wraps=test_dict.events) return test_dict @pytest.mark.parametrize( "method_name, args, expected", [ ("__getitem__", ("A",), 1), # read ("__setitem__", ("A", 3), None), # update ("__setitem__", ("D", 3), None), # add new entry ("__delitem__", ("A",), None), # delete ("__len__", (), 3), ("__newlike__", ({"A": 1},), {"A": 1}), ("copy", (), {"A": 1, "B": 2, "C": 3}), ], ) def test_dict_interface_parity(regular_dict, test_dict, method_name, args, expected): """Test that EventedDict interface is equivalent to the builtin dict.""" test_dict_method = getattr(test_dict, method_name) assert test_dict == regular_dict if hasattr(regular_dict, method_name): regular_dict_method = getattr(regular_dict, method_name) assert test_dict_method(*args) == regular_dict_method(*args) == expected assert test_dict == regular_dict else: test_dict_method(*args) # smoke test def test_dict_inits(): a = EventedDict({"A": 1, "B": 2, "C": 3}) b = EventedDict(A=1, B=2, C=3) c = EventedDict({"A": 1}, B=2, C=3) assert a == b == c def test_dict_repr(test_dict): assert repr(test_dict) == "EventedDict({'A': 1, 'B': 2, 'C': 3})" def test_instantiation_without_data(): """Test that EventedDict can be instantiated without data.""" test_dict = EventedDict() assert isinstance(test_dict, EventedDict) def test_basetype_enforcement_on_instantiation(): """EventedDict with basetype set 
should enforce types on instantiation.""" with pytest.raises(TypeError): test_dict = EventedDict({"A": "not an int"}, basetype=int) test_dict = EventedDict({"A": 1}) assert isinstance(test_dict, EventedDict) def test_basetype_enforcement_on_set_item(): """EventedDict with basetype set should enforces types on setitem.""" test_dict = EventedDict(basetype=int) test_dict["A"] = 1 with pytest.raises(TypeError): test_dict["A"] = "not an int" def test_dict_add_events(test_dict): """Test that events are emitted before and after an item is added.""" test_dict.events.adding.emit = Mock(wraps=test_dict.events.adding.emit) test_dict.events.added.emit = Mock(wraps=test_dict.events.added.emit) test_dict["D"] = 4 test_dict.events.adding.emit.assert_called_with("D") test_dict.events.added.emit.assert_called_with("D", 4) test_dict.events.adding.emit.reset_mock() test_dict.events.added.emit.reset_mock() test_dict["D"] = 4 test_dict.events.adding.emit.assert_not_called() test_dict.events.added.emit.assert_not_called() def test_dict_change_events(test_dict): """Test that events are emitted when an item in the dictionary is replaced.""" # events shouldn't be emitted on addition test_dict.events.changing.emit = Mock(wraps=test_dict.events.changing.emit) test_dict.events.changed.emit = Mock(wraps=test_dict.events.changed.emit) test_dict["D"] = 4 test_dict.events.changing.emit.assert_not_called() test_dict.events.changed.emit.assert_not_called() test_dict["C"] = 4 test_dict.events.changing.emit.assert_called_with("C") test_dict.events.changed.emit.assert_called_with("C", 3, 4) def test_dict_remove_events(test_dict): """Test that events are emitted before and after an item is removed.""" test_dict.events.removing.emit = Mock(wraps=test_dict.events.removing.emit) test_dict.events.removed.emit = Mock(wraps=test_dict.events.removed.emit) test_dict.pop("C") test_dict.events.removing.emit.assert_called_with("C") test_dict.events.removed.emit.assert_called_with("C", 3) def test_copy_no_sync(): d1 = EventedDict({1: 1, 2: 2, 3: 3}) d2 = copy(d1) d1[4] = 4 d1[3] = 4 assert len(d2) == 3 assert d2[3] == 3 psygnal-0.15.0/tests/containers/test_evented_list.py0000644000000000000000000003002515073705675017624 0ustar00import os from copy import copy from typing import cast from unittest.mock import Mock, call import numpy as np import pytest from psygnal import EmissionInfo, PathStep, Signal, SignalGroup from psygnal.containers import EventedList @pytest.fixture def regular_list(): return list(range(5)) @pytest.fixture def test_list(regular_list): test_list = EventedList(regular_list) test_list.events = Mock(wraps=test_list.events) return test_list @pytest.mark.parametrize( "meth", [ # METHOD, ARGS, EXPECTED EVENTS # primary interface ("insert", (2, 10), ("inserting", "inserted")), # create ("__getitem__", (2,), ()), # read ("__setitem__", (2, 3), ("changed",)), # update ("__setitem__", (slice(2), [1, 2]), ("changed",)), # update slice ("__setitem__", (slice(2, 2), [1, 2]), ("changed",)), # update slice ("__delitem__", (2,), ("removing", "removed")), # delete ( "__delitem__", (slice(2),), ("removing", "removed") * 2, ), ("__delitem__", (slice(0, 0),), ("removing", "removed")), ( "__delitem__", (slice(-3),), ("removing", "removed") * 2, ), ( "__delitem__", (slice(-2, None),), ("removing", "removed") * 2, ), # inherited interface ("append", (3,), ("inserting", "inserted")), ("clear", (), ("removing", "removed") * 5), ("count", (3,), ()), ("extend", ([7, 8, 9],), ("inserting", "inserted") * 3), ("index", (3,), ()), ("pop", (-2,), 
("removing", "removed")), ("remove", (3,), ("removing", "removed")), ("reverse", (), ("reordered",)), ("__add__", ([7, 8, 9],), ()), ("__iadd__", ([7, 9],), ("inserting", "inserted") * 2), ("__radd__", ([7, 9],), ("inserting", "inserted") * 2), # sort? ], ids=lambda x: x[0], ) def test_list_interface_parity(test_list, regular_list, meth): test_list.events = cast("Mock", test_list.events) method_name, args, expected = meth test_list_method = getattr(test_list, method_name) assert tuple(test_list) == tuple(regular_list) if hasattr(regular_list, method_name): regular_list_method = getattr(regular_list, method_name) assert test_list_method(*args) == regular_list_method(*args) assert tuple(test_list) == tuple(regular_list) else: test_list_method(*args) # smoke test for c, expect in zip(test_list.events.call_args_list, expected, strict=False): event = c.args[0] assert event.type == expect def test_delete(test_list): assert test_list == [0, 1, 2, 3, 4] del test_list[1] assert test_list == [0, 2, 3, 4] del test_list[2:] assert test_list == [0, 2] @pytest.mark.xfail("i686" in os.getenv("AUDITWHEEL_PLAT", ""), reason="failing on i686") def test_hash(test_list): assert id(test_list) == hash(test_list) b = EventedList([2, 3], hashable=False) with pytest.raises(TypeError): hash(b) def test_repr(test_list): assert repr(test_list) == "EventedList([0, 1, 2, 3, 4])" def test_reverse(test_list): assert test_list == [0, 1, 2, 3, 4] test_list.reverse() test_list.events.reordered.emit.assert_called_once() test_list.events.changed.emit.assert_not_called() assert test_list == [4, 3, 2, 1, 0] test_list.events.reordered.emit.reset_mock() test_list.reverse(emit_individual_events=True) test_list.events.reordered.emit.assert_called_once() assert test_list.events.changed.emit.call_count == 4 test_list.events.changed.emit.assert_has_calls( [call(0, 4, 0), call(4, 0, 4), call(1, 3, 1), call(3, 1, 3)] ) assert test_list == [0, 1, 2, 3, 4] def test_list_interface_exceptions(test_list): bad_index = {"a": "dict"} with pytest.raises(TypeError): test_list[bad_index] with pytest.raises(TypeError): test_list[bad_index] = 1 with pytest.raises(TypeError): del test_list[bad_index] with pytest.raises(TypeError): test_list.insert([bad_index], 0) def test_copy(test_list, regular_list): """Copying an evented list should return a same-class evented list.""" new_test = test_list.copy() new_reg = regular_list.copy() assert id(new_test) != id(test_list) assert new_test == test_list assert tuple(new_test) == tuple(test_list) == tuple(new_reg) test_list.events.assert_not_called() def test_array_like_setitem(): """Test that EventedList.__setitem__ works for array-like items""" array = np.array((10, 10)) evented_list = EventedList([array]) evented_list[0] = array def test_slice(test_list, regular_list): """Slicing an evented list should return a same-class evented list.""" test_slice = test_list[1:3] regular_slice = regular_list[1:3] assert tuple(test_slice) == tuple(regular_slice) assert isinstance(test_slice, test_list.__class__) change_emit = test_list.events.changed.emit assert test_list == [0, 1, 2, 3, 4] test_list[1:3] = [6, 7, 8] assert test_list == [0, 6, 7, 8, 3, 4] change_emit.assert_called_with(slice(1, 3, None), [1, 2], [6, 7, 8]) with pytest.raises(ValueError) as e: test_list[1:6:2] = [6, 7, 8, 6, 7] assert str(e.value).startswith("attempt to assign sequence of size 5 to extended ") test_list[1:6:2] = [9, 9, 9] assert test_list == [0, 9, 7, 9, 3, 9] change_emit.assert_called_with(slice(1, 6, 2), [6, 8, 4], [9, 9, 9]) with 
pytest.raises(TypeError) as e2: test_list[1:3] = 1 assert str(e2.value) == "Can only assign an iterable to slice" def test_move(test_list: EventedList) -> None: """Test the that we can move objects with the move method""" test_list.events = cast("Mock", test_list.events) def _fail() -> None: raise AssertionError("unexpected event called") test_list.events.removing.connect(_fail) test_list.events.removed.connect(_fail) test_list.events.inserting.connect(_fail) test_list.events.inserted.connect(_fail) before = list(test_list) assert before == [0, 1, 2, 3, 4] # from fixture # pop the object at 0 and insert at current position 3 test_list.move(0, 3) expectation = [1, 2, 0, 3, 4] assert test_list != before assert test_list == expectation test_list.events.moving.emit.assert_called_once_with(0, 3) test_list.events.moved.emit.assert_called_once_with(0, 2, 0) test_list.events.reordered.emit.assert_called_once() test_list.events.moving.emit.reset_mock() test_list.move(2, 2) test_list.events.moving.emit.assert_not_called() # noop # move the other way # pop the object at 3 and insert at current position 0 assert test_list == [1, 2, 0, 3, 4] test_list.move(3, 0) assert test_list == [3, 1, 2, 0, 4] # negative index destination test_list.move(1, -2) assert test_list == [3, 2, 0, 1, 4] BASIC_INDICES: list[tuple] = [ ((2,), 0, [2, 0, 1, 3, 4, 5, 6, 7]), # move single item ([0, 2, 3], 6, [1, 4, 5, 0, 2, 3, 6, 7]), # move back ([4, 7], 1, [0, 4, 7, 1, 2, 3, 5, 6]), # move forward ([0, 5, 6], 3, [1, 2, 0, 5, 6, 3, 4, 7]), # move in between ([1, 3, 5, 7], 3, [0, 2, 1, 3, 5, 7, 4, 6]), # same as above ([0, 2, 3, 2, 3], 6, [1, 4, 5, 0, 2, 3, 6, 7]), # strip dupe indices ] OTHER_INDICES: list[tuple] = [ ([7, 4], 1, [0, 7, 4, 1, 2, 3, 5, 6]), # move forward reorder ([3, 0, 2], 6, [1, 4, 5, 3, 0, 2, 6, 7]), # move back reorder ((2, 4), -2, [0, 1, 3, 5, 6, 2, 4, 7]), # negative indexing ([slice(None, 3)], 6, [3, 4, 5, 0, 1, 2, 6, 7]), # move slice back ([slice(5, 8)], 2, [0, 1, 5, 6, 7, 2, 3, 4]), # move slice forward ([slice(1, 8, 2)], 3, [0, 2, 1, 3, 5, 7, 4, 6]), # move slice between ([slice(None, 8, 3)], 4, [1, 2, 0, 3, 6, 4, 5, 7]), ([slice(None, 8, 3), 0, 3, 6], 4, [1, 2, 0, 3, 6, 4, 5, 7]), ] MOVING_INDICES = BASIC_INDICES + OTHER_INDICES @pytest.mark.parametrize("sources, dest, expectation", MOVING_INDICES) def test_move_multiple(sources, dest, expectation): """Test the that we can move objects with the move method""" el = EventedList(range(8)) el.events = Mock(wraps=el.events) assert el == [0, 1, 2, 3, 4, 5, 6, 7] def _fail(): raise AssertionError("unexpected event called") el.events.removing.connect(_fail) el.events.removed.connect(_fail) el.events.inserting.connect(_fail) el.events.inserted.connect(_fail) el.move_multiple(sources, dest) assert el == expectation el.events.moving.emit.assert_called() el.events.moved.emit.assert_called() el.events.reordered.emit.assert_called() def test_move_multiple_mimics_slice_reorder(): """Test the that move_multiple provides the same result as slice insertion.""" data = list(range(8)) el = EventedList(data) el.events = Mock(wraps=el.events) assert el == data new_order = [1, 5, 3, 4, 6, 7, 2, 0] # this syntax el.move_multiple(new_order, 0) # is the same as this syntax data[:] = [data[i] for i in new_order] assert el == new_order assert el == data assert el.events.moving.emit.call_args_list == [ call(1, 0), call(5, 1), call(4, 2), call(5, 3), call(6, 4), call(7, 5), call(7, 6), ] assert el.events.moved.emit.call_args_list == [ call(1, 0, 1), call(5, 1, 5), call(4, 2, 3), 
call(5, 3, 4), call(6, 4, 6), call(7, 5, 7), call(7, 6, 2), ] el.events.reordered.emit.assert_called() # move_multiple also works omitting the insertion index el[:] = list(range(8)) expected = [el[i] for i in new_order] el.move_multiple(new_order) assert el == expected def test_child_events(): """Test that evented lists bubble child events.""" # create a random object that emits events class E: test = Signal(str) e_obj = E() root: EventedList[E] = EventedList(child_events=True) mock = Mock() root.events.all.connect(mock) root.append(e_obj) assert len(e_obj.test) == 1 assert root == [e_obj] e_obj.test.emit("hi") assert mock.call_count == 3 expected = [ call(EmissionInfo(root.events.inserting, (0,), path=(PathStep(index=0),))), call(EmissionInfo(root.events.inserted, (0, e_obj), path=(PathStep(index=0),))), call( EmissionInfo( e_obj.test, ("hi",), path=(PathStep(index=0), PathStep(attr="test")) ) ), ] mock.assert_has_calls(expected) del root[0] assert len(e_obj.test) == 0 def test_child_events_groups(): """Test that evented lists bubble child events.""" # create a random object that emits events class Group(SignalGroup): test = Signal(str) test2 = Signal(str) class E: def __init__(self): self.events = Group(self) e_obj = E() root: EventedList[E] = EventedList(child_events=True) mock = Mock() root.events.all.connect(mock) root.append(e_obj) assert root == [e_obj] e_obj.events.test2.emit("hi") assert [c[0][0].signal.name for c in mock.call_args_list] == [ "inserting", "inserted", "test2", # This is now the direct child signal, not child_event ] # when an object in the list owns an emitter group, then any emitter in that group # will also be detected, and the child event will be emitted directly with path info expected = [ call(EmissionInfo(root.events.inserting, (0,), path=(PathStep(index=0),))), call(EmissionInfo(root.events.inserted, (0, e_obj), path=(PathStep(index=0),))), call(EmissionInfo(e_obj.events.test2, ("hi",), path=(PathStep(index=0),))), ] # note that we can get back to the actual object in the list using the .instance # attribute on signal instances. 
assert e_obj.events.test2.instance.all.instance == e_obj mock.assert_has_calls(expected) def test_copy_no_sync(): l1 = EventedList([1, 2, 3]) l2 = copy(l1) l1.append(4) assert len(l2) == 3 psygnal-0.15.0/tests/containers/test_evented_proxy.py0000644000000000000000000001230315073705675020031 0ustar00from unittest.mock import Mock, call import numpy as np from psygnal import EmissionInfo, SignalGroup from psygnal.containers import ( EventedCallableObjectProxy, EventedObjectProxy, _evented_proxy, ) from psygnal.utils import monitor_events def test_evented_proxy(): class T: def __init__(self) -> None: self.x = 1 self.f = "f" self._list = [0, 1] def __getitem__(self, key): return self._list[key] def __setitem__(self, key, value): self._list[key] = value def __delitem__(self, key): del self._list[key] t = EventedObjectProxy(T()) assert "events" in dir(t) assert t.x == 1 mock = Mock() with monitor_events(t.events, mock): t.x = 2 t.f = "f" del t.x t.y = "new" t[0] = 7 t[0] = 7 # no event del t[0] assert mock.call_args_list == [ call(EmissionInfo(t.events.attribute_set, ("x", 2))), call(EmissionInfo(t.events.attribute_deleted, ("x",))), call(EmissionInfo(t.events.attribute_set, ("y", "new"))), call(EmissionInfo(t.events.item_set, (0, 7))), call(EmissionInfo(t.events.item_deleted, (0,))), ] def test_evented_proxy_ref(): class T: def __init__(self) -> None: self.x = 1 assert not _evented_proxy._OBJ_CACHE t = EventedObjectProxy(T()) assert not _evented_proxy._OBJ_CACHE assert isinstance(t.events, SignalGroup) # this will actually create the group assert len(_evented_proxy._OBJ_CACHE) == 1 del t # this should clean up the object from the cache assert not _evented_proxy._OBJ_CACHE def test_in_place_proxies(): # fmt: off class T: x = 0 def __iadd__(self, other): return self def __isub__(self, other): return self def __imul__(self, other): return self def __imatmul__(self, other): return self def __itruediv__(self, other): return self def __ifloordiv__(self, other): return self def __imod__(self, other): return self def __ipow__(self, other): return self def __ilshift__(self, other): return self def __irshift__(self, other): return self def __iand__(self, other): return self def __ixor__(self, other): return self def __ior__(self, other): return self # fmt: on t = EventedObjectProxy(T()) mock = Mock() with monitor_events(t.events, mock): t += 1 mock.assert_called_with(EmissionInfo(t.events.in_place, ("add", 1))) t -= 2 mock.assert_called_with(EmissionInfo(t.events.in_place, ("sub", 2))) t *= 3 mock.assert_called_with(EmissionInfo(t.events.in_place, ("mul", 3))) t /= 4 mock.assert_called_with(EmissionInfo(t.events.in_place, ("truediv", 4))) t //= 5 mock.assert_called_with(EmissionInfo(t.events.in_place, ("floordiv", 5))) t @= 6 mock.assert_called_with(EmissionInfo(t.events.in_place, ("matmul", 6))) t %= 7 mock.assert_called_with(EmissionInfo(t.events.in_place, ("mod", 7))) t **= 8 mock.assert_called_with(EmissionInfo(t.events.in_place, ("pow", 8))) t <<= 9 mock.assert_called_with(EmissionInfo(t.events.in_place, ("lshift", 9))) t >>= 10 mock.assert_called_with(EmissionInfo(t.events.in_place, ("rshift", 10))) t &= 11 mock.assert_called_with(EmissionInfo(t.events.in_place, ("and", 11))) t ^= 12 mock.assert_called_with(EmissionInfo(t.events.in_place, ("xor", 12))) t |= 13 mock.assert_called_with(EmissionInfo(t.events.in_place, ("or", 13))) def test_numpy_proxy() -> None: ary = np.ones((4, 4)) t: EventedObjectProxy[np.ndarray] = EventedObjectProxy(ary) assert repr(t) == repr(ary) mock = Mock() with 
monitor_events(t.events, mock): t[0] = 2 info = next(iter(mock.call_args))[0] assert isinstance(info, EmissionInfo) signal = info.signal (key, value) = info.args assert signal.name == "item_set" assert key == 0 assert np.array_equal(value, [2, 2, 2, 2]) mock.reset_mock() t[2:] = np.arange(8).reshape(2, 4) info = next(iter(mock.call_args))[0] assert isinstance(info, EmissionInfo) signal = info.signal (key, value) = info.args assert signal.name == "item_set" assert key == slice(2, None, None) assert np.array_equal(value, [[0, 1, 2, 3], [4, 5, 6, 7]]) mock.reset_mock() t += 1 t -= 1 assert np.array_equal( t, np.asarray([[2, 2, 2, 2], [1, 1, 1, 1], [0, 1, 2, 3], [4, 5, 6, 7]]) ) t *= [0, 0, 0, 0] assert not t.any() assert mock.call_args_list == [ call(EmissionInfo(t.events.in_place, ("add", 1))), call(EmissionInfo(t.events.in_place, ("sub", 1))), call(EmissionInfo(t.events.in_place, ("mul", [0, 0, 0, 0]))), ] def test_evented_callable_proxy(): calls = [] def f(*args, **kwargs): calls.append((args, kwargs)) ef = EventedCallableObjectProxy(f) ef(1, 2, foo="bar") assert calls == [((1, 2), {"foo": "bar"})] psygnal-0.15.0/tests/containers/test_evented_set.py0000644000000000000000000001052715073705675017451 0ustar00from copy import copy from unittest.mock import Mock, call import pytest from psygnal.containers import EventedOrderedSet, EventedSet, OrderedSet @pytest.fixture def regular_set(): return set(range(5)) @pytest.fixture(params=[EventedSet, EventedOrderedSet]) def test_set(request, regular_set): return request.param(regular_set) @pytest.mark.parametrize( "meth", [ # METHOD, ARGS, EXPECTED EVENTS # primary interface ("add", 2, []), ("add", 10, [call((10,), ())]), ("discard", 2, [call((), (2,))]), ("remove", 2, [call((), (2,))]), ("discard", 10, []), # parity with set ("update", {3, 4, 5, 6}, [call((5, 6), ())]), ("difference_update", {3, 4, 5, 6}, [call((), (3, 4))]), ("intersection_update", {3, 4, 5, 6}, [call((), (0, 1, 2))]), ("symmetric_difference_update", {3, 4, 5, 6}, [call((5, 6), (3, 4))]), ], ids=lambda x: x[0], ) def test_set_interface_parity(test_set: EventedSet, regular_set: set, meth): method_name, arg, expected = meth mock = Mock() test_set.events.items_changed.connect(mock) test_set_method = getattr(test_set, method_name) assert tuple(test_set) == tuple(regular_set) regular_set_method = getattr(regular_set, method_name) assert test_set_method(arg) == regular_set_method(arg) assert tuple(test_set) == tuple(regular_set) mock.assert_has_calls(expected) assert type(test_set).__name__ in repr(test_set) def test_set_pop(test_set: EventedSet): mock = Mock() test_set.events.items_changed.connect(mock) npops = len(test_set) while test_set: test_set.pop() assert mock.call_count == npops with pytest.raises(KeyError): test_set.pop() with pytest.raises(KeyError): test_set.remove(34) def test_set_clear(test_set: EventedSet): mock = Mock() test_set.events.items_changed.connect(mock) mock.assert_not_called() test_set.clear() mock.assert_called_once_with((), (0, 1, 2, 3, 4)) @pytest.mark.parametrize( "meth", [ ("difference", {3, 4, 5, 6}), ("intersection", {3, 4, 5, 6}), ("issubset", {3, 4}), ("issubset", {3, 4, 5, 6}), ("issubset", {1, 2, 3, 4, 5, 6}), ("issuperset", {3, 4}), ("issuperset", {3, 4, 5, 6}), ("issuperset", {1, 2, 3, 4, 5, 6}), ("symmetric_difference", {3, 4, 5, 6}), ("union", {3, 4, 5, 6}), ], ) def test_set_new_objects(test_set: EventedSet, regular_set: set, meth): method_name, arg = meth test_set_method = getattr(test_set, method_name) assert tuple(test_set) == 
tuple(regular_set) mock = Mock() test_set.events.items_changed.connect(mock) regular_set_method = getattr(regular_set, method_name) result = test_set_method(arg) assert result == regular_set_method(arg) assert isinstance(result, (EventedSet, EventedOrderedSet, bool)) assert result is not test_set mock.assert_not_called() def test_ordering(): tup = (24, 16, 8, 4, 5, 6) s_tup = set(tup) os_tup = OrderedSet(tup) assert tuple(s_tup) != tup assert repr(s_tup) == "{4, 5, 6, 8, 16, 24}" assert tuple(os_tup) == tup assert repr(os_tup) == "OrderedSet((24, 16, 8, 4, 5, 6))" os_tup.discard(8) os_tup.add(8) assert tuple(os_tup) == (24, 16, 4, 5, 6, 8) def test_copy(test_set): from copy import copy assert test_set.copy() == copy(test_set) assert test_set is not copy(test_set) assert isinstance(copy(test_set), type(test_set)) def test_repr(test_set): if isinstance(test_set, EventedOrderedSet): assert repr(test_set) == "EventedOrderedSet((0, 1, 2, 3, 4))" else: assert repr(test_set) == "EventedSet({0, 1, 2, 3, 4})" def test_copy_no_sync(): s1 = EventedSet([1, 2, 3]) s2 = copy(s1) s1.add(4) assert len(s2) == 3 def test_set_emission_order(): s = EventedSet() def callback1(): if 1 not in s: s.add(1) def callback2(): if 5 not in s: s.update(range(5, 10)) s.events.items_changed.connect(callback1) s.events.items_changed.connect(callback2) mock = Mock() s.events.items_changed.connect(mock) s.add(11) mock.assert_has_calls( [ call((11,), ()), call((1,), ()), call((5, 6, 7, 8, 9), ()), ] ) psygnal-0.15.0/tests/containers/test_nested_containers.py0000644000000000000000000000431615073705675020652 0ustar00from dataclasses import dataclass, field from typing import ClassVar, cast from unittest.mock import Mock import pytest from psygnal import EmissionInfo, PathStep, SignalGroupDescriptor, evented from psygnal.containers import EventedDict, EventedList, EventedSet EL2 = EventedList([10, 20, 30]) EL = EventedList([1, 2, EL2]) # TODO: re-emit events from nested lists ED: EventedDict[str, int] = EventedDict({"a": 1, "b": 2}) ES = EventedSet({"x", "y", "z"}) @evented @dataclass class A: x: int = 1 @pytest.fixture def EventedClass(): @dataclass class M: events: ClassVar[SignalGroupDescriptor] = SignalGroupDescriptor( connect_child_events=True ) a: EventedList = field(default_factory=lambda: EL.copy()) b: EventedDict = field(default_factory=lambda: ED.copy()) c: EventedSet = field(default_factory=lambda: ES.copy()) d: A = field(default_factory=A) return M @pytest.mark.parametrize( "expr, expect", [ # list element reassignment ("m.a[1] = 12", ((1, 2, 12), (PathStep(attr="a"), PathStep(index=1)))), # list attribute replaced wholesale ("m.a = [0, 2]", (([0, 2], EL), (PathStep(attr="a"),))), # dict item updated ("m.b['a'] = 3", (("a", 1, 3), (PathStep(attr="b"), PathStep(key="a")))), # dataclass attribute replaced ("m.b = {'x': 11}", (({"x": 11}, ED), (PathStep(attr="b"),))), # set mutated ("m.c.add('w')", ((("w",), ()), (PathStep(attr="c"),))), # set attribute replaced wholesale (r"m.c = {1}", (({1}, ES), (PathStep(attr="c"),))), # nested dataclass field change ("m.d.x = 12", ((12, 1), (PathStep(attr="d"), PathStep(attr="x")))), # dataclass attribute replaced wholesale ("m.d = {'x': 1}", (({"x": 1}, A(x=1)), (PathStep(attr="d"),))), # evented object replaced with another evented object ("m.d = A(x=99)", ((A(x=99), A(x=1)), (PathStep(attr="d"),))), ], ) def test_nested_containers(expr, expect, EventedClass): m = EventedClass() mock = Mock() m.events.connect(mock) exec(expr) info = cast("EmissionInfo", mock.call_args[0][0]) 
assert (info.args, info.path) == expect psygnal-0.15.0/tests/containers/test_selectable_evented_list.py0000644000000000000000000001054515073705675022014 0ustar00from unittest.mock import Mock import pytest from psygnal.containers import SelectableEventedList @pytest.fixture def regular_list() -> list: return list(range(5)) @pytest.fixture def test_list(regular_list: list) -> SelectableEventedList: test_list = SelectableEventedList(regular_list) test_list.events = Mock(wraps=test_list.events) test_list.selection.events = Mock(wraps=test_list.selection.events) return test_list def test_select_item_not_in_list(test_list: SelectableEventedList) -> None: """Items not in list should not be added to selection.""" with pytest.raises(ValueError): test_list.selection.add(6) assert 6 not in test_list.selection def test_newly_selected_item_is_active(test_list: SelectableEventedList) -> None: """Items added to a selection should become active.""" test_list.selection.clear() test_list.selection.add(1) assert test_list.selection.active == 1 def test_select_all(test_list: SelectableEventedList) -> None: """Select all should populate the selection.""" test_list.selection.update = Mock(wraps=test_list.selection.update) test_list.selection.clear() assert not test_list.selection test_list.select_all() assert all(el in test_list.selection for el in range(5)) test_list.selection.update.assert_called_once() def test_deselect_all(test_list: SelectableEventedList) -> None: """Deselect all should clear the selection""" test_list.selection.clear = Mock(wraps=test_list.selection.clear) test_list.selection = list(range(5)) assert all(el in test_list.selection for el in range(5)) test_list.deselect_all() assert not test_list.selection test_list.selection.clear.assert_called_once() @pytest.mark.parametrize( "initial_selection, step, expand_selection, wraparound, expected", [ ({0}, 1, False, False, {1}), ({0}, 2, False, False, {2}), ({0}, 2, True, False, {0, 2}), ({0, 1}, 1, False, False, {1}), ({0}, 5, False, False, {4}), ({0}, 5, False, True, {0}), ({}, 1, False, False, {4}), ], ) def test_select_next( test_list: SelectableEventedList, initial_selection, step, expand_selection, wraparound, expected, ): """Test select next method behaviour.""" test_list.selection = initial_selection test_list.select_next( step=step, expand_selection=expand_selection, wraparound=wraparound ) assert test_list.selection == expected def test_select_next_with_empty_list(): """Selection should remain unchanged on advancing if list is empty.""" test_list = SelectableEventedList([]) initial_selection = test_list.selection.copy() test_list.select_next() assert test_list.selection == initial_selection @pytest.mark.parametrize( "initial_selection, expand_selection, wraparound, expected", [ ({1}, False, False, {0}), ({0}, False, False, {0}), ({1}, True, False, {0, 1}), ({1, 2}, False, False, {0}), ({0}, False, True, {4}), ], ) def test_select_previous( test_list, initial_selection, expand_selection, wraparound, expected ): """Test select next method behaviour.""" test_list.selection = initial_selection test_list.select_previous(expand_selection=expand_selection, wraparound=wraparound) assert test_list.selection == expected def test_item_discarded_from_selection_on_removal_from_list( test_list: SelectableEventedList, ) -> None: """Check that items removed from a list are also removed from the selection.""" test_list.selection.clear() test_list.selection.discard = Mock(wraps=test_list.selection.discard) test_list.selection = {0} assert 0 in 
test_list.selection test_list.remove(0) assert 0 not in test_list.selection test_list.selection.discard.assert_called_once() def test_remove_selected(test_list: SelectableEventedList) -> None: """Test items are removed from both the selection and the list.""" test_list.selection.clear() initial_selection = {0, 1} test_list.selection = initial_selection assert test_list.selection == initial_selection output = test_list.remove_selected() assert set(output) == initial_selection assert all(el not in test_list for el in initial_selection) assert all(el not in test_list.selection for el in initial_selection) assert test_list.selection == {2} psygnal-0.15.0/tests/containers/test_selection.py0000644000000000000000000000565715073705675017141 0ustar00from unittest.mock import Mock from psygnal.containers import Selection def test_add_and_remove_from_selection(): selection = Selection() selection.events._current = Mock() assert not selection._current assert not selection selection.add(1) selection._current = 1 selection.events._current.emit.assert_called_once() assert 1 in selection assert selection._current == 1 selection.remove(1) assert not selection def test_update_active_called_on_selection_change(): selection = Selection() selection._update_active = Mock() selection.add(1) selection._update_active.assert_called_once() def test_active_event_emitted_on_selection_change(): selection = Selection() selection.events.active = Mock() assert not selection.active selection.add(1) assert selection.active == 1 selection.events.active.emit.assert_called_once() def test_current_setter(): """Current event should only emit if value changes.""" selection = Selection() selection._current = 1 selection.events._current = Mock() selection._current = 1 selection.events._current.emit.assert_not_called() selection._current = 2 selection.events._current.emit.assert_called_once() def test_active_setter(): """Active setter should make value the only selected item, make it current and emit the active event.""" selection = Selection() selection.events.active = Mock() assert not selection._current selection.active = 1 assert selection.active == 1 assert selection._current == 1 selection.events.active.emit.assert_called_once() def test_select_only(): mock = Mock() selection = Selection([1, 2]) selection.active = 1 assert selection.active == 1 selection.events.items_changed.connect(mock) selection.select_only(2) mock.assert_called_once_with((2,), (1,)) assert selection.active == 2 def test_clear(): selection = Selection([1, 2]) selection._current = 2 assert len(selection) == 2 selection.clear(keep_current=True) assert len(selection) == 0 assert selection._current == 2 selection.clear(keep_current=False) assert selection._current is None def test_toggle(): selection = Selection() selection.symmetric_difference_update = Mock() selection.toggle(1) selection.symmetric_difference_update.assert_called_once() def test_emit_change(): """emit change is overridden to also update the active value.""" selection = Selection() selection._update_active = Mock() selection._emit_change((None,), (None,)) selection._update_active.assert_called_once() def test_hash(): assert hash(Selection()) def test_replace_selection(): mock = Mock() selection = Selection([1, 2, 3]) selection.events.items_changed.connect(mock) selection.replace_selection([3, 4, 5]) mock.assert_called_once_with((4, 5), (1, 2)) assert set(selection) == {3, 4, 5} psygnal-0.15.0/.gitignore0000644000000000000000000000247015073705675012212 0ustar00.idea/ # Byte-compiled / optimized 
psygnal-0.15.0/.gitignore
.idea/

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so
*.c

# temporarily disabled mypyc files
*.so_BAK
*.pyd_BAK

# Distribution / packaging
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/
.pytest_cache/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# pyenv
.python-version

# celery beat schedule file
celerybeat-schedule

# SageMath parsed files
*.sage.py

# dotenv
.env

# virtualenv
.venv
venv/
ENV/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/

# IDE settings
.vscode/

_version.py
psygnal/_version.py
.asv/
wheelhouse/

# for now...
uv.lock

psygnal-0.15.0/LICENSE
Copyright (c) 2021, Talley Lambert

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice,
   this list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright notice,
   this list of conditions and the following disclaimer in the documentation
   and/or other materials provided with the distribution.

3. Neither the name of the copyright holder nor the names of its contributors
   may be used to endorse or promote products derived from this software
   without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
psygnal-0.15.0/README.md
# psygnal

[![License](https://img.shields.io/pypi/l/psygnal.svg?color=green)](https://github.com/pyapp-kit/psygnal/raw/master/LICENSE)
[![PyPI](https://img.shields.io/pypi/v/psygnal.svg?color=green)](https://pypi.org/project/psygnal)
[![Conda](https://img.shields.io/conda/v/conda-forge/psygnal)](https://github.com/conda-forge/psygnal-feedstock)
[![Python Version](https://img.shields.io/pypi/pyversions/psygnal.svg?color=green)](https://python.org)
[![CI](https://github.com/pyapp-kit/psygnal/actions/workflows/test.yml/badge.svg)](https://github.com/pyapp-kit/psygnal/actions/workflows/test.yml)
[![codecov](https://codecov.io/gh/pyapp-kit/psygnal/branch/main/graph/badge.svg?token=qGnz9GXpEb)](https://codecov.io/gh/pyapp-kit/psygnal)
[![Documentation Status](https://readthedocs.org/projects/psygnal/badge/?version=latest)](https://psygnal.readthedocs.io/en/latest/?badge=latest)
[![Benchmarks](https://img.shields.io/badge/⏱-codspeed-%23FF7B53)](https://codspeed.io/pyapp-kit/psygnal)

Psygnal (pronounced "signal") is a pure-Python implementation of the
[observer pattern](https://en.wikipedia.org/wiki/Observer_pattern), with the API of
[Qt-style Signals](https://doc.qt.io/qt-5/signalsandslots.html), (optional) signature
and type checking, and support for threading. It has no dependencies.

> This library does ***not*** require or use Qt in any way. It simply implements
> a similar observer-pattern API.

## Documentation

https://psygnal.readthedocs.io/

### Install

```sh
pip install psygnal
```

```sh
conda install -c conda-forge psygnal
```

## Usage

The [observer pattern](https://en.wikipedia.org/wiki/Observer_pattern) is a software
design pattern in which an object maintains a list of its dependents ("**observers**")
and notifies them of any state changes – usually by calling a **callback function**
provided by the observer.

Here is a simple example of using psygnal:

```python
from psygnal import Signal

class MyObject:
    # define one or more signals as class attributes
    value_changed = Signal(str)

# create an instance
my_obj = MyObject()

# You (or others) can connect callbacks to your signals
@my_obj.value_changed.connect
def on_change(new_value: str):
    print(f"The value changed to {new_value}!")

# The object may now emit signals when appropriate,
# (for example in a setter method)
my_obj.value_changed.emit('hi')  # prints "The value changed to hi!"
```

Much more detail is available in the [documentation](https://psygnal.readthedocs.io/)!

### Evented Dataclasses

A particularly nice use of the signal pattern is to emit a signal whenever a field of
a dataclass changes. Psygnal provides an `@evented` decorator that does exactly that.
It is compatible with `dataclasses` from
[the standard library](https://docs.python.org/3/library/dataclasses.html), as well as
[attrs](https://www.attrs.org/en/stable/) and
[pydantic](https://pydantic-docs.helpmanual.io):

```python
from psygnal import evented
from dataclasses import dataclass

@evented
@dataclass
class Person:
    name: str
    age: int = 0

person = Person('John', age=30)

# connect callbacks
@person.events.age.connect
def _on_age_change(new_age: int):
    print(f"Age changed to {new_age}")

person.age = 31  # prints: Age changed to 31
```

See the [dataclass documentation](https://psygnal.readthedocs.io/en/latest/dataclasses/) for more details.
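For example, a minimal sketch (not taken from this README) of the same pattern applied to an attrs class, assuming `@evented` behaves for attrs as it does for standard dataclasses; the `Person` fields here are illustrative:

```python
from attrs import define
from psygnal import evented

@evented
@define
class Person:
    name: str
    age: int = 0

person = Person("Sue", age=40)

# connect a callback to the per-field signal, exactly as in the dataclass example
@person.events.age.connect
def _on_age_change(new_age: int):
    print(f"Age changed to {new_age}")

person.age = 41  # prints: Age changed to 41
```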
### Evented Containers

`psygnal.containers` provides evented versions of mutable data structures
(`dict`, `list`, `set`), for cases when you need to monitor mutation:

```python
from psygnal.containers import EventedList

my_list = EventedList([1, 2, 3, 4, 5])

my_list.events.inserted.connect(lambda i, val: print(f"Inserted {val} at index {i}"))
my_list.events.removed.connect(lambda i, val: print(f"Removed {val} at index {i}"))

my_list.append(6)  # Output: Inserted 6 at index 5
my_list.pop()      # Output: Removed 6 at index 5
```

See the [evented containers documentation](https://psygnal.readthedocs.io/en/latest/API/containers/) for more details.

## Benchmark history

https://pyapp-kit.github.io/psygnal/ and https://codspeed.io/pyapp-kit/psygnal

## Developers

### Setup

This project uses PEP 735 dependency groups. After cloning, set up your environment
with `uv sync` or `pip install -e . --group dev`.

### Compiling

While `psygnal` is a pure-Python package, it is compiled with mypyc to increase
performance. To test the compiled version locally, you can run:

```bash
HATCH_BUILD_HOOKS_ENABLE=1 uv sync --force-reinstall
```

(which is also available as `make build` if you have `make` installed)

### Debugging

To disable all compiled files and run the pure-Python version, you may run:

```bash
python -c "import psygnal.utils; psygnal.utils.decompile()"
```

To return to the compiled version, run:

```bash
python -c "import psygnal.utils; psygnal.utils.recompile()"
```

The `psygnal._compiled` variable will tell you whether you are using the compiled
version or not.
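As a quick sanity check, a two-line sketch that prints the flag mentioned above (the exact value depends on how psygnal was installed):

```python
import psygnal

print(psygnal._compiled)  # truthy when the mypyc-compiled version is in use
```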
psygnal-0.15.0/pyproject.toml
# https://peps.python.org/pep-0517/
[build-system]
requires = ["hatchling>=1.8.0", "hatch-vcs"]
build-backend = "hatchling.build"

# https://peps.python.org/pep-0621/
[project]
name = "psygnal"
description = "Fast python callback/event system modeled after Qt Signals"
readme = "README.md"
requires-python = ">=3.10"
license = { text = "BSD 3-Clause License" }
authors = [{ name = "Talley Lambert", email = "talley.lambert@gmail.com" }]
classifiers = [
    "Development Status :: 5 - Production/Stable",
    "License :: OSI Approved :: BSD License",
    "Natural Language :: English",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.10",
    "Programming Language :: Python :: 3.11",
    "Programming Language :: Python :: 3.12",
    "Programming Language :: Python :: 3.13",
    "Programming Language :: Python :: 3.14",
    "Typing :: Typed",
]
dynamic = ["version"]
dependencies = [
    # typing-extensions is in the source code, but not actually required
    # at runtime. All uses are guarded by if TYPE_CHECKING
]

# extras
# https://peps.python.org/pep-0621/#dependencies-optional-dependencies
[project.optional-dependencies]
proxy = ["wrapt"]
pydantic = ["pydantic"]

[dependency-groups]
test-min = ["pytest>=6.0", "pytest-cov", "pytest-asyncio"]
test = [
    { include-group = "test-min" },
    "dask[array]>=2024.0.0",
    "attrs",
    "numpy >1.21.6",
    "pydantic",
    "pyinstaller>=4.0",
    "wrapt",
    "msgspec; python_version < '3.14'",
    "toolz",
    "anyio",
    "trio",
]
test-typing = [{ include-group = "test-min" }, "pytest-mypy-plugins; python_version < '3.14'"]
testqt = [{ include-group = "test" }, "pytest-qt", "qtpy"]
test-codspeed = [{ include-group = "test" }, "pytest-codspeed"]
docs = [
    "mkdocs-api-autonav",
    "mkdocs-material",
    "mkdocs-minify-plugin",
    "mkdocs-spellcheck[all]",
    "mkdocs",
    "mkdocstrings-python",
    "ruff",
]
dev = [
    { include-group = "test" },
    { include-group = "test-typing" },
    { include-group = "docs" },
    "PyQt6",
    "ipython",
    "mypy",
    "mypy_extensions",
    "pre-commit",
    "asv",
    "ruff",
    "typing-extensions",
    "rich>=14.0.0",
    "pdbpp; sys_platform != 'win32'",
    "pytest-benchmark>=5.1.0",
]

[project.urls]
homepage = "https://github.com/pyapp-kit/psygnal"
repository = "https://github.com/pyapp-kit/psygnal"
documentation = "https://psygnal.readthedocs.io"

[project.entry-points.pyinstaller40]
hook-dirs = "psygnal._pyinstaller_util._pyinstaller_hook:get_hook_dirs"

[tool.hatch.version]
source = "vcs"

[tool.hatch.build.targets.sdist]
include = ["src", "tests", "CHANGELOG.md"]

[tool.hatch.build.targets.wheel]
only-include = ["src"]
sources = ["src"]

[tool.hatch.build.targets.wheel.hooks.mypyc]
mypy-args = ["--ignore-missing-imports"]
enable-by-default = false
require-runtime-dependencies = true
dependencies = [
    "hatch-mypyc>=0.13.0",
    "mypy",
    "mypy_extensions >=0.4.2",
    "pydantic!=2.10.0", # typing error in v2.10 prevents mypyc from working
    "types-attrs",
    "msgspec; python_version < '3.14'",
]
exclude = [
    "src/psygnal/__init__.py",
    "src/psygnal/_evented_model.py",
    "src/psygnal/utils.py",
    "src/psygnal/containers",
    "src/psygnal/qt.py",
    "src/psygnal/_pyinstaller_util",
    "src/psygnal/_throttler.py",
    "src/psygnal/_async.py",
    "src/psygnal/testing.py",
]

[tool.cibuildwheel]
# Skip 32-bit builds & PyPy wheels on all platforms
skip = ["*-manylinux_i686", "*-musllinux_i686", "*-win32", "pp*"]
build = ["cp310-*", "cp311-*", "cp312-*", "cp313-*", "cp314-*"]
test-groups = ["test"]
test-command = "pytest {project}/tests -v"
test-skip = ["*-musllinux*", "cp312-win*", "*-macosx_arm64"]
build-frontend = "build[uv]"

[[tool.cibuildwheel.overrides]]
select = "*-manylinux_i686*"
before-all = "yum install -y python3-devel"

[tool.cibuildwheel.environment]
HATCH_BUILD_HOOKS_ENABLE = "1"

[tool.check-wheel-contents]
# W004: Module is not located at importable path (hook-psygnal.py)
ignore = ["W004"]

# https://docs.astral.sh/ruff/
[tool.ruff]
line-length = 88
target-version = "py310"
src = ["src", "tests"]

[tool.ruff.lint]
pydocstyle = { convention = "numpy" }
select = [
    "E",    # style errors
    "F",    # flakes
    "W",    # warnings
    "D",    # pydocstyle
    "D417", # Missing argument descriptions in Docstrings
    "I",    # isort
    "UP",   # pyupgrade
    "S",    # bandit
    "C4",   # flake8-comprehensions
    "B",    # flake8-bugbear
    "A001", # flake8-builtins
    "TC",   # flake8-typecheck
    "TID",  # flake8-tidy-imports
    "RUF",  # ruff-specific rules
]
ignore = [
    "D401", # First line should be in imperative mood
]

[tool.ruff.lint.per-file-ignores]
"tests/*.py" = ["D", "S", "RUF012"]
"benchmarks/*.py" = ["D", "RUF012"]

# https://docs.astral.sh/ruff/formatter/
[tool.ruff.format]
docstring-code-format = true

# https://docs.pytest.org/en/6.2.x/customize.html
[tool.pytest.ini_options]
minversion = "6.0"
testpaths = ["tests"]
asyncio_default_fixture_loop_scope = "function"
addopts = ["--color=yes"]
filterwarnings = [
    "error",
    "ignore:The distutils package is deprecated:DeprecationWarning:",
    "ignore:.*BackendFinder.find_spec()", # pyinstaller import
    "ignore:.*not using a cooperative constructor:pytest.PytestDeprecationWarning:",
    "ignore:The frame locals reference is no longer cached",
    "ignore:Failed to disconnect::pytestqt",
    "ignore:.*unclosed.*socket.*:ResourceWarning:", # asyncio internal socket cleanup
    "ignore:.*unclosed event loop.*:ResourceWarning:", # asyncio internal event loop cleanup
]

# https://mypy.readthedocs.io/en/stable/config_file.html
[tool.mypy]
files = "src/**/*.py"
strict = true
disallow_any_generics = false
disallow_subclassing_any = false
show_error_codes = true
pretty = true

[[tool.mypy.overrides]]
module = ["numpy.*", "wrapt", "pydantic.*"]
ignore_errors = true

[[tool.mypy.overrides]]
module = ["wrapt"]
ignore_missing_imports = true

[[tool.mypy.overrides]]
module = ["tests.*"]
disallow_untyped_defs = false

# https://coverage.readthedocs.io/en/6.4/config.html
[tool.coverage.report]
exclude_lines = [
    "pragma: no cover",
    "if TYPE_CHECKING:",
    "@overload",
    "except ImportError",
    "\\.\\.\\.",
    "raise NotImplementedError()",
]
show_missing = true

[tool.coverage.run]
source = ["psygnal"]
omit = ["*/_pyinstaller_util/*"]

# https://github.com/mgedmin/check-manifest#configuration
[tool.check-manifest]
ignore = [
    ".ruff_cache/**/*",
    ".github_changelog_generator",
    ".pre-commit-config.yaml",
    "tests/**/*",
    "typesafety/*",
    ".devcontainer/*",
    ".readthedocs.yaml",
    "Makefile",
    "asv.conf.json",
    "benchmarks/*",
    "docs/**/*",
    "mkdocs.yml",
    "src/**/*.c",
    "codecov.yml",
    "CHANGELOG.md",
]

[tool.typos.default]
extend-ignore-identifiers-re = ["ser_schema"]

psygnal-0.15.0/PKG-INFO
Metadata-Version: 2.4
Name: psygnal
Version: 0.15.0
Summary: Fast python callback/event system modeled after Qt Signals
Project-URL: homepage, https://github.com/pyapp-kit/psygnal
Project-URL: repository, https://github.com/pyapp-kit/psygnal
Project-URL: documentation, https://psygnal.readthedocs.io
Author-email: Talley Lambert <talley.lambert@gmail.com>
License: BSD 3-Clause License
License-File: LICENSE
Classifier: Development Status :: 5 - Production/Stable
Classifier: License :: OSI Approved :: BSD License
Classifier: Natural Language :: English
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Classifier: Typing :: Typed
Requires-Python: >=3.10
Provides-Extra: proxy
Requires-Dist: wrapt; extra == 'proxy'
Provides-Extra: pydantic
Requires-Dist: pydantic; extra == 'pydantic'
Description-Content-Type: text/markdown