Compare commits


106 Commits

Author SHA1 Message Date
Miguel Grinberg
eb5e249e34 Release 2.3.3 2025-07-01 23:46:00 +01:00
Miguel Grinberg
9bc3dced6c Handle partial reads in WebSocket class (Fixes #294) 2025-06-30 18:32:21 +01:00
Miguel Grinberg
786e5e5337 Additional documentation for the URLPattern class 2025-06-30 18:23:46 +01:00
Ozuba
1d419ce59b Add svg to supported mimetypes (#302) 2025-06-30 12:24:24 +01:00
Miguel Grinberg
7c98c4589d Additional documentation on WebSocket and SSE disconnections 2025-06-28 11:01:22 +01:00
Miguel Grinberg
0f219fd494 fix linter errors #nolog 2025-06-28 10:48:20 +01:00
Miguel Grinberg
e146e2d08d More detailed documentation for current_user 2025-06-28 10:40:59 +01:00
Miguel Grinberg
dc61470fa9 More detailed documentation for route responses 2025-06-28 10:40:30 +01:00
Miguel Grinberg
d7a9c53563 Add a sub-application example 2025-06-20 23:59:04 +01:00
dependabot[bot]
4ddb09ceb3 Bump urllib3 from 2.2.2 to 2.5.0 in /examples/benchmark (#301) #nolog
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.2.2 to 2.5.0.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.2.2...2.5.0)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-version: 2.5.0
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-19 09:20:41 +01:00
Miguel Grinberg
3dffa05ffb Documentation improvements for the Request class 2025-06-18 20:09:59 +01:00
dependabot[bot]
b93a55c9f2 Bump requests from 2.32.0 to 2.32.4 in /examples/benchmark (#300) #nolog
Bumps [requests](https://github.com/psf/requests) from 2.32.0 to 2.32.4.
- [Release notes](https://github.com/psf/requests/releases)
- [Changelog](https://github.com/psf/requests/blob/main/HISTORY.md)
- [Commits](https://github.com/psf/requests/compare/v2.32.0...v2.32.4)

---
updated-dependencies:
- dependency-name: requests
  dependency-version: 2.32.4
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-10 10:16:45 +01:00
Miguel Grinberg
f5d3d931ed Support for SSE responses in the test client 2025-05-18 18:26:38 +01:00
Miguel Grinberg
654a85f46b Do not silence exceptions that occur in the SSE task 2025-05-18 12:21:17 +01:00
Miguel Grinberg
3c936a82e0 Version 2.3.3.dev0 2025-05-08 23:11:35 +01:00
Miguel Grinberg
4c0ace1b01 Release 2.3.2 2025-05-08 23:02:29 +01:00
Miguel Grinberg
d9d7ff0825 use async error handlers in auth module (Fixes #298) 2025-05-08 20:07:35 +01:00
dependabot[bot]
7c42a18436 Bump h11 from 0.14.0 to 0.16.0 in /examples/benchmark (#293) #nolog
Bumps [h11](https://github.com/python-hyper/h11) from 0.14.0 to 0.16.0.
- [Commits](https://github.com/python-hyper/h11/compare/v0.14.0...v0.16.0)

---
updated-dependencies:
- dependency-name: h11
  dependency-version: 0.16.0
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-24 19:04:29 +01:00
Miguel Grinberg
ea84fcb435 Version 2.3.2.dev0 2025-04-13 00:01:21 +01:00
Miguel Grinberg
f30c4733f0 Release 2.3.1 2025-04-13 00:01:12 +01:00
Miguel Grinberg
cd0b3234dd Additional support needed when using orjson 2025-04-12 23:58:48 +01:00
Miguel Grinberg
1f64478957 Version 2.3.1.dev0 2025-04-12 23:33:26 +01:00
Miguel Grinberg
815594fc8b Release 2.3.0 2025-04-12 23:31:54 +01:00
Miguel Grinberg
086f2af3de Use orjson instead of json if available 2025-04-12 23:24:31 +01:00
Miguel Grinberg
f317b15bdb Support optional authentication methods 2025-04-06 23:52:36 +01:00
Miguel Grinberg
b6f232db11 Addressed typing warnings from pyright 2025-04-06 23:52:36 +01:00
Miguel Grinberg
e7ee74d6bb Catch SSL crashes while writing the response (Fixes #206) 2025-03-22 19:02:06 +00:00
dependabot[bot]
847dfd1321 Bump gunicorn from 22.0.0 to 23.0.0 in /examples/benchmark (#291) #nolog
Bumps [gunicorn](https://github.com/benoitc/gunicorn) from 22.0.0 to 23.0.0.
- [Release notes](https://github.com/benoitc/gunicorn/releases)
- [Commits](https://github.com/benoitc/gunicorn/compare/22.0.0...23.0.0)

---
updated-dependencies:
- dependency-name: gunicorn
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-22 12:41:50 +00:00
Miguel Grinberg
1aa035378e Updates to change log #nolog 2025-03-22 12:40:27 +00:00
Miguel Grinberg
1edfb8daa7 Version 2.2.1.dev0 2025-03-22 12:37:02 +00:00
Miguel Grinberg
9337a2ec9b Release 2.2.0 2025-03-22 12:35:02 +00:00
Miguel Grinberg
11a91a6035 Support for multipart/form-data requests (#287) 2025-03-22 12:24:12 +00:00
Miguel Grinberg
99f65c0198 Additional urldecode tests 2025-03-16 20:39:50 +00:00
Miguel Grinberg
4cc2e95338 Update micropython version used in tests to 1.24.1 2025-03-16 20:34:38 +00:00
Miguel Grinberg
d203df75fe urldecoding should always be done in bytes 2025-03-16 20:32:34 +00:00
dependabot[bot]
00bf535821 Bump jinja2 from 3.1.5 to 3.1.6 in /examples/benchmark (#286) #nolog
Bumps [jinja2](https://github.com/pallets/jinja) from 3.1.5 to 3.1.6.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/jinja/compare/3.1.5...3.1.6)

---
updated-dependencies:
- dependency-name: jinja2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-06 10:19:46 +00:00
Miguel Grinberg
3bc31f10b2 Simplified urldecode logic 2025-03-03 19:16:18 +00:00
Miguel Grinberg
aa76e6378b Delay route compilation to allow late register_type calls 2025-03-03 19:10:33 +00:00
Miguel Grinberg
c6b99b6d81 Documentation improvements 2025-03-02 19:47:21 +00:00
Miguel Grinberg
953dd94321 Expose the Jinja environment as Template.jinja_env 2025-03-02 11:53:54 +00:00
Miguel Grinberg
68a53a7ae7 Update README #nolog 2025-03-02 00:51:23 +00:00
Miguel Grinberg
c92b5ae282 Redesigned the URL parser to allow for custom path components 2025-03-02 00:48:07 +00:00
dependabot[bot]
48ce31e699 Bump quart from 0.19.7 to 0.20.0 in /examples/benchmark (#283) #nolog
Bumps [quart](https://github.com/pallets/quart) from 0.19.7 to 0.20.0.
- [Release notes](https://github.com/pallets/quart/releases)
- [Changelog](https://github.com/pallets/quart/blob/main/CHANGES.md)
- [Commits](https://github.com/pallets/quart/compare/0.19.7...0.20.0)

---
updated-dependencies:
- dependency-name: quart
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-04 11:19:19 +00:00
dependabot[bot]
6a33e817a2 Bump jinja2 from 3.1.4 to 3.1.5 in /examples/benchmark (#284) #nolog
Bumps [jinja2](https://github.com/pallets/jinja) from 3.1.4 to 3.1.5.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/jinja/compare/3.1.4...3.1.5)

---
updated-dependencies:
- dependency-name: jinja2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-04 11:18:55 +00:00
Miguel Grinberg
265009ecd6 Version 2.1.1.dev0 2025-02-04 00:35:10 +00:00
Miguel Grinberg
2efbd67878 Release 2.1.0 2025-02-04 00:31:06 +00:00
Miguel Grinberg
d807011ad0 user logins 2025-02-04 00:04:55 +00:00
Miguel Grinberg
675c978797 Basic and token authentication support 2025-02-03 20:00:36 +00:00
Miguel Grinberg
cd87abba30 Mount unit tests 2025-02-03 11:06:26 +00:00
Miguel Grinberg
fd7931e1ae Added Request.url_prefix, Request.subapp and local mounts 2025-02-03 00:33:59 +00:00
Maxi
d487a73c1e add js to sse example (#281) 2025-01-22 23:42:51 +00:00
Miguel Grinberg
d864b81b65 revert to default funding file #nolog 2025-01-06 17:49:09 +00:00
Miguel Grinberg
d7459f23b2 Version 2.0.8.dev0 2024-11-10 22:57:45 +00:00
Miguel Grinberg
32f5e415e7 Release 2.0.7 2024-11-10 22:57:31 +00:00
Miguel Grinberg
c46e429106 Accept responses with just a status code (Fixes #263) 2024-11-10 20:09:05 +00:00
Miguel Grinberg
4eac013087 Accept responses with just a status code (Fixes #263) 2024-11-10 00:35:21 +00:00
dependabot[bot]
496a288064 Bump werkzeug from 3.0.3 to 3.0.6 in /examples/benchmark (#260) #nolog
Bumps [werkzeug](https://github.com/pallets/werkzeug) from 3.0.3 to 3.0.6.
- [Release notes](https://github.com/pallets/werkzeug/releases)
- [Changelog](https://github.com/pallets/werkzeug/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/werkzeug/compare/3.0.3...3.0.6)

---
updated-dependencies:
- dependency-name: werkzeug
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-26 11:37:08 +01:00
dependabot[bot]
bcd876fcae Bump quart from 0.19.4 to 0.19.7 in /examples/benchmark (#259) #nolog
Bumps [quart](https://github.com/pallets/quart) from 0.19.4 to 0.19.7.
- [Release notes](https://github.com/pallets/quart/releases)
- [Changelog](https://github.com/pallets/quart/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/quart/compare/0.19.4...0.19.7)

---
updated-dependencies:
- dependency-name: quart
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-26 00:57:50 +01:00
Stanislav Garanzha
5e5fc5e93e Fix urls in docs (#253)
* Fix 404 external links in intro.rst

* Fix broken links in extensions.rst

`examples/cors/cors.py` does not exist

`uvicorn.org` - failed to resolve
2024-08-17 18:41:43 +01:00
Miguel Grinberg
8895af3737 add tox to dev dependencies 2024-08-15 20:40:54 +01:00
Miguel Grinberg
0a021462e0 Better documentation for start_server() method (Fixes #252) 2024-08-15 19:10:34 +01:00
Lukas Kremla
482ab6d5ca Fixed gzip automatic content-type assignment and added automatic compression header configuration (#251)
* Fixed gzip automatic content-type assignment and added automatic compression setting

This implements the fix for detecting the proper content-type even when the file has the ".gz" extension. It further makes sure the compression headers are set properly if a "gz." file is detected, but the compression headers weren't explicitly set by the user.

* Added a test for properly auto-determining mime types and setting content encoding header

* Modified the gzip file header assignments and following tests according to the feedback.

---------

Co-authored-by: Lukáš Kremla <lukas.kremla@bonnel.cz>
2024-08-14 23:02:23 +01:00
dependabot[bot]
5fe06f6bd5 Bump certifi from 2023.11.17 to 2024.7.4 in /examples/benchmark (#244)
Bumps [certifi](https://github.com/certifi/python-certifi) from 2023.11.17 to 2024.7.4.
- [Commits](https://github.com/certifi/python-certifi/compare/2023.11.17...2024.07.04)

---
updated-dependencies:
- dependency-name: certifi
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-06 11:17:41 +01:00
dependabot[bot]
c170e840ec Bump urllib3 from 2.1.0 to 2.2.2 in /examples/benchmark (#241) #nolog
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.1.0 to 2.2.2.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.1.0...2.2.2)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-19 00:12:28 +01:00
Miguel Grinberg
3a39b47ea8 Version 2.0.7.dev0 2024-06-18 23:14:36 +01:00
Miguel Grinberg
53287217ae Release 2.0.6 2024-06-18 23:14:14 +01:00
Miguel Grinberg
6ffb8a8fe9 Cookie path support in session and test client 2024-06-18 20:56:18 +01:00
Miguel Grinberg
0151611fc8 Configurable session cookie options (Fixes #242) 2024-06-18 00:09:44 +01:00
dependabot[bot]
4204db61e5 Bump jinja2 from 3.1.3 to 3.1.4 in /examples/benchmark (#230) #nolog
Bumps [jinja2](https://github.com/pallets/jinja) from 3.1.3 to 3.1.4.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/jinja/compare/3.1.3...3.1.4)

---
updated-dependencies:
- dependency-name: jinja2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-21 11:40:13 +01:00
dependabot[bot]
12438743a8 Bump werkzeug from 3.0.1 to 3.0.3 in /examples/benchmark (#229) #nolog
Bumps [werkzeug](https://github.com/pallets/werkzeug) from 3.0.1 to 3.0.3.
- [Release notes](https://github.com/pallets/werkzeug/releases)
- [Changelog](https://github.com/pallets/werkzeug/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/werkzeug/compare/3.0.1...3.0.3)

---
updated-dependencies:
- dependency-name: werkzeug
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-21 11:39:51 +01:00
dependabot[bot]
7cbb1edf59 Bump requests from 2.31.0 to 2.32.0 in /examples/benchmark (#232) #nolog
updated-dependencies:
- dependency-name: requests
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-21 11:39:16 +01:00
Miguel Grinberg
dac6df7a7a use codecov token for coverage uploads #nolog 2024-04-28 00:31:53 +01:00
dependabot[bot]
5d6e838f3c Bump gunicorn from 21.2.0 to 22.0.0 in /examples/benchmark (#224) #nolog
Bumps [gunicorn](https://github.com/benoitc/gunicorn) from 21.2.0 to 22.0.0.
- [Release notes](https://github.com/benoitc/gunicorn/releases)
- [Commits](https://github.com/benoitc/gunicorn/compare/21.2.0...22.0.0)

---
updated-dependencies:
- dependency-name: gunicorn
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-17 23:52:37 +01:00
dependabot[bot]
563bfdc8f5 Bump idna from 3.6 to 3.7 in /examples/benchmark (#223) #nolog
Bumps [idna](https://github.com/kjd/idna) from 3.6 to 3.7.
- [Release notes](https://github.com/kjd/idna/releases)
- [Changelog](https://github.com/kjd/idna/blob/master/HISTORY.rst)
- [Commits](https://github.com/kjd/idna/compare/v3.6...v3.7)

---
updated-dependencies:
- dependency-name: idna
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-12 15:16:52 +01:00
Miguel Grinberg
679d8e63b8 Fix docs build #nolog 2024-03-24 19:56:43 +00:00
Miguel Grinberg
4cb155ee41 Improved cookie support in the test client 2024-03-24 19:45:22 +00:00
Miguel Grinberg
dea79c5ce2 Make Session class more reusable 2024-03-23 16:29:36 +00:00
Carlo Colombo
6b1fd61917 removed outdated import from documentation (Fixes #216) 2024-03-15 11:03:10 +00:00
Miguel Grinberg
f6876c0d15 Use @wraps on decorated functions 2024-03-14 00:16:35 +00:00
Hamsanger
904d5fcaa2 Add event ID to the SSE implementation (#213) 2024-03-10 23:52:43 +00:00
Miguel Grinberg
a0ea439def Add roadmap details to readme 2024-03-10 17:15:14 +00:00
Miguel Grinberg
a1801d9a53 Version 2.0.6.dev0 2024-03-09 10:48:18 +00:00
Miguel Grinberg
14f2c9d345 Release 2.0.5 2024-03-09 10:48:09 +00:00
Miguel Grinberg
d0a4cf8fa7 Handle 0 as an integer argument (Fixes #212) 2024-03-06 20:34:00 +00:00
Miguel Grinberg
901f4e55b8 Version 2.0.5.dev0 2024-02-20 23:15:01 +00:00
Miguel Grinberg
53b28f9938 Release 2.0.4 2024-02-20 23:12:31 +00:00
Miguel Grinberg
f6cba2c0f7 More URLPattern unit tests 2024-02-20 23:09:24 +00:00
Miguel Grinberg
38262c56d3 Do not use regexes for parsing simple URLs (Fixes #207) 2024-02-18 15:05:41 +00:00
dependabot[bot]
a3363c7b8c Bump fastapi from 0.104.1 to 0.109.1 in /examples/benchmark (#203) #nolog
Bumps [fastapi](https://github.com/tiangolo/fastapi) from 0.104.1 to 0.109.1.
- [Release notes](https://github.com/tiangolo/fastapi/releases)
- [Commits](https://github.com/tiangolo/fastapi/compare/0.104.1...0.109.1)

---
updated-dependencies:
- dependency-name: fastapi
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-05 17:43:31 +00:00
Miguel Grinberg
e44c271bae Circuitpython build 2024-01-31 23:43:17 +00:00
Miguel Grinberg
bf519478cb Added documentation on using alternative utemplate loaders 2024-01-14 12:47:28 +00:00
dependabot[bot]
8d1ca808cb Bump jinja2 from 3.1.2 to 3.1.3 in /examples/benchmark (#199) #nolog
Bumps [jinja2](https://github.com/pallets/jinja) from 3.1.2 to 3.1.3.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/jinja/compare/3.1.2...3.1.3)

---
updated-dependencies:
- dependency-name: jinja2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-11 22:56:46 +00:00
Miguel Grinberg
1f804f869c Version 2.0.4.dev0 2024-01-07 10:50:36 +00:00
Miguel Grinberg
7a6026006f Release 2.0.3 2024-01-07 10:50:28 +00:00
Miguel Grinberg
6712c47400 Pass keyword arguments to thread executor in the correct way (Fixes #195) 2024-01-07 10:44:16 +00:00
Miguel Grinberg
c8c91e8345 Update uasyncio to include new TLS support 2024-01-04 20:41:05 +00:00
Miguel Grinberg
5d188e8c0d Add a limit to WebSocket message size (Fixes #193) 2024-01-03 00:04:02 +00:00
Miguel Grinberg
b80b6b64d0 Documentation improvements 2023-12-28 12:59:19 +00:00
Miguel Grinberg
28007ea583 Version 2.0.3.dev0 2023-12-28 12:11:32 +00:00
Miguel Grinberg
300f8563ed Release 2.0.2 2023-12-28 12:10:46 +00:00
Miguel Grinberg
1fc11193da Support binary data in the SSE extension 2023-12-28 12:04:17 +00:00
Miguel Grinberg
79452a4699 Upgrade micropython tests to use v1.22, initial circuitpython work 2023-12-27 20:39:20 +00:00
Miguel Grinberg
84842e39c3 Improvements to migration guide 2023-12-26 20:00:07 +00:00
Miguel Grinberg
2a3c889717 typo in documentation #nolog 2023-12-26 17:07:20 +00:00
Tak Tran
ad368be993 Remove spurious async in documentation example (#187) 2023-12-23 14:08:12 +00:00
Miguel Grinberg
3df56c6ffe Version 2.0.2.dev0 2023-12-23 12:50:03 +00:00
76 changed files with 4951 additions and 467 deletions

.github/FUNDING.yml vendored

@@ -1,3 +0,0 @@
github: miguelgrinberg
patreon: miguelgrinberg
custom: https://paypal.me/miguelgrinberg


@@ -16,6 +16,7 @@ jobs:
- run: python -m pip install --upgrade pip wheel
- run: pip install tox tox-gh-actions
- run: tox -eflake8
- run: tox -edocs
tests:
name: tests
strategy:
@@ -41,6 +42,15 @@ jobs:
- run: python -m pip install --upgrade pip wheel
- run: pip install tox tox-gh-actions
- run: tox -eupy
tests-circuitpython:
name: tests-circuitpython
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v3
- run: python -m pip install --upgrade pip wheel
- run: pip install tox tox-gh-actions
- run: tox -ecpy
coverage:
name: coverage
runs-on: ubuntu-latest
@@ -54,6 +64,7 @@ jobs:
with:
files: ./coverage.xml
fail_ci_if_error: true
token: ${{ secrets.CODECOV_TOKEN }}
benchmark:
name: benchmark
runs-on: ubuntu-latest

.gitignore vendored

@@ -25,6 +25,8 @@ wheels/
.installed.cfg
*.egg
MANIFEST
requirements.txt
requirements-dev.txt
# PyInstaller
# Usually these files are written by a python script from a template
@@ -90,6 +92,8 @@ venv/
ENV/
env.bak/
venv.bak/
.direnv
.envrc
# Spyder project settings
.spyderproject


@@ -1,5 +1,94 @@
# Microdot change log
**Release 2.3.3** - 2025-07-01
- Handle partial reads in WebSocket class [#294](https://github.com/miguelgrinberg/microdot/issues/294) ([commit](https://github.com/miguelgrinberg/microdot/commit/9bc3dced6c1f582dde0496961d25170b448ad8d7))
- Add SVG to supported mimetypes [#302](https://github.com/miguelgrinberg/microdot/issues/302) ([commit](https://github.com/miguelgrinberg/microdot/commit/1d419ce59bf7006617109c05dc2d6fc6d1dc8235)) (thanks **Ozuba**!)
- Do not silence exceptions that occur in the SSE task ([commit](https://github.com/miguelgrinberg/microdot/commit/654a85f46b7dd7a1e94f81193c4a78a8a1e99936))
- Add Support for SSE responses in the test client ([commit](https://github.com/miguelgrinberg/microdot/commit/f5d3d931edfbacedebf5fdf938ef77c5ee910380))
- Documentation improvements for the `Request` class ([commit](https://github.com/miguelgrinberg/microdot/commit/3dffa05ffb229813156b71e10a85283bdaa26d5e))
- Additional documentation for the `URLPattern` class ([commit](https://github.com/miguelgrinberg/microdot/commit/786e5e533748e1343612c97123773aec9a1a99fc))
- More detailed documentation for route responses ([commit](https://github.com/miguelgrinberg/microdot/commit/dc61470fa959549bb43313906ba6ed9f686babc2))
- Additional documentation on WebSocket and SSE disconnections ([commit](https://github.com/miguelgrinberg/microdot/commit/7c98c4589de4774a88381b393444c75094532550))
- More detailed documentation for `current_user` ([commit](https://github.com/miguelgrinberg/microdot/commit/e146e2d08deddf9b924c7657f04db28d71f34221))
- Add a sub-application example ([commit](https://github.com/miguelgrinberg/microdot/commit/d7a9c535639268e415714b12ac898ae38e516308))
**Release 2.3.2** - 2025-05-08
- Use async error handlers in auth module [#298](https://github.com/miguelgrinberg/microdot/issues/298) ([commit](https://github.com/miguelgrinberg/microdot/commit/d9d7ff0825e4c5fbed6564d3684374bf3937df11))
**Release 2.3.1** - 2025-04-13
- Additional support needed when using `orjson` ([commit](https://github.com/miguelgrinberg/microdot/commit/cd0b3234ddb0c8ff4861d369836ec2aed77494db))
**Release 2.3.0** - 2025-04-12
- Support optional authentication methods ([commit](https://github.com/miguelgrinberg/microdot/commit/f317b15bdbf924007e5e3414e0c626baccc3ede6))
- Catch SSL exceptions while writing the response [#206](https://github.com/miguelgrinberg/microdot/issues/206) ([commit](https://github.com/miguelgrinberg/microdot/commit/e7ee74d6bba74cfd89b9ddc38f28e02514eb1791))
- Use `orjson` instead of `json` if available ([commit](https://github.com/miguelgrinberg/microdot/commit/086f2af3deab86d4340f3f1feb9e019de59f351d))
- Addressed typing warnings from pyright ([commit](https://github.com/miguelgrinberg/microdot/commit/b6f232db1125045d79c444c736a2ae59c5501fdd))
**Release 2.2.0** - 2025-03-22
- Support for `multipart/form-data` requests [#287](https://github.com/miguelgrinberg/microdot/issues/287) ([commit](https://github.com/miguelgrinberg/microdot/commit/11a91a60350518e426b557fae8dffe75912f8823))
- Support custom path components in URLs ([commit #1](https://github.com/miguelgrinberg/microdot/commit/c92b5ae28222af5a1094f5d2f70a45d4d17653d5) [commit #2](https://github.com/miguelgrinberg/microdot/commit/aa76e6378b37faab52008a8aab8db75f81b29323))
- Expose the Jinja environment as `Template.jinja_env` ([commit](https://github.com/miguelgrinberg/microdot/commit/953dd9432122defe943f0637bbe7e01f2fc7743f))
- Simplified urldecode logic ([commit #1](https://github.com/miguelgrinberg/microdot/commit/3bc31f10b2b2d4460c62366013278d87665f0f97) [commit #2](https://github.com/miguelgrinberg/microdot/commit/d203df75fef32c7cc0fe7cc6525e77522b37a289))
- Additional urldecode tests ([commit](https://github.com/miguelgrinberg/microdot/commit/99f65c0198590c0dfb402c24685b6f8dfba1935d))
- Documentation improvements ([commit](https://github.com/miguelgrinberg/microdot/commit/c6b99b6d8117d4e40e16d5b953dbf4deb023d24d))
- Update micropython version used in tests to 1.24.1 ([commit](https://github.com/miguelgrinberg/microdot/commit/4cc2e95338a7de3b03742389004147ee21285621))
**Release 2.1.0** - 2025-02-04
- User login support ([commit](https://github.com/miguelgrinberg/microdot/commit/d807011ad006e53e70c4594d7eac04d03bb08681))
- Basic and token authentication support ([commit](https://github.com/miguelgrinberg/microdot/commit/675c9787974da926af446974cd96ef224e0ee27f))
- Added `local` argument to the `app.mount()` method, to define sub-application specific before and after request handlers ([commit](https://github.com/miguelgrinberg/microdot/commit/fd7931e1aec173c60f81dad18c1a102ed8f0e081))
- Added `Request.url_prefix`, `Request.subapp` and local mounts ([commit](https://github.com/miguelgrinberg/microdot/commit/fd7931e1aec173c60f81dad18c1a102ed8f0e081))
- Added a front end to the SSE example [#281](https://github.com/miguelgrinberg/microdot/issues/281) ([commit](https://github.com/miguelgrinberg/microdot/commit/d487a73c1ea5b3467e23907618b348ca52e0235c)) (thanks **Maxi**!)
- Additional ``app.mount()`` unit tests ([commit](https://github.com/miguelgrinberg/microdot/commit/cd87abba30206ec6d3928e0aabacb2fccf7baf70))
**Release 2.0.7** - 2024-11-10
- Accept responses with just a status code [#263](https://github.com/miguelgrinberg/microdot/issues/263) ([commit #1](https://github.com/miguelgrinberg/microdot/commit/4eac013087f807cafa244b8a6b7b0ed4c82ff150) [commit #2](https://github.com/miguelgrinberg/microdot/commit/c46e4291061046f1be13f300dd08645b71c16635))
- Fixed compressed file content-type assignment [#251](https://github.com/miguelgrinberg/microdot/issues/251) ([commit](https://github.com/miguelgrinberg/microdot/commit/482ab6d5ca068d71ea6301f45918946161e9fcc1)) (thanks **Lukas Kremla**!)
- Better documentation for `start_server()` [#252](https://github.com/miguelgrinberg/microdot/issues/252) ([commit](https://github.com/miguelgrinberg/microdot/commit/0a021462e0c42c249d587a2d600f5a21a408adfc))
- Fix URLs in documentation [#253](https://github.com/miguelgrinberg/microdot/issues/253) ([commit](https://github.com/miguelgrinberg/microdot/commit/5e5fc5e93e11cbf6e3dc8036494e8732d1815d3e)) (thanks **Stanislav Garanzha**!)
**Release 2.0.6** - 2024-06-18
- Add event ID to the SSE implementation [#213](https://github.com/miguelgrinberg/microdot/issues/213) ([commit](https://github.com/miguelgrinberg/microdot/commit/904d5fcaa2d19d939a719b8e68c4dee3eb470739)) (thanks **Hamsanger**!)
- Configurable session cookie options [#242](https://github.com/miguelgrinberg/microdot/issues/242) ([commit](https://github.com/miguelgrinberg/microdot/commit/0151611fc84fec450820d673f4c4d70c32c990a7))
- Improved cookie support in the test client ([commit](https://github.com/miguelgrinberg/microdot/commit/4cb155ee411dc2d9c9f15714cb32b25ba79b156a))
- Cookie path support in session extension and test client ([commit](https://github.com/miguelgrinberg/microdot/commit/6ffb8a8fe920111c4d8c16e98715a0d5ee2d1da3))
- Refactor `Session` class to make it more reusable ([commit](https://github.com/miguelgrinberg/microdot/commit/dea79c5ce224dec7858ffef45a42bed442fd3a5a))
- Use `@functools.wraps` on decorated functions ([commit](https://github.com/miguelgrinberg/microdot/commit/f6876c0d154adcae96098405fb6a1fdf1ea4ec28))
- Removed outdated import from documentation [#216](https://github.com/miguelgrinberg/microdot/issues/216) ([commit](https://github.com/miguelgrinberg/microdot/commit/6b1fd6191702e7a9ad934fddfcdd0a3cebea7c94)) (thanks **Carlo Colombo**!)
- Add roadmap details to readme ([commit](https://github.com/miguelgrinberg/microdot/commit/a0ea439def238084c4d68309c0992b66ffd28ad6))
**Release 2.0.5** - 2024-03-09
- Correct handling of 0 as an integer argument (regression from #207) [#212](https://github.com/miguelgrinberg/microdot/issues/212) ([commit](https://github.com/miguelgrinberg/microdot/commit/d0a4cf8fa7dfb1da7466157b18d3329a8cf9a5df))
**Release 2.0.4** - 2024-02-20
- Do not use regexes for parsing simple URLs [#207](https://github.com/miguelgrinberg/microdot/issues/207) ([commit #1](https://github.com/miguelgrinberg/microdot/commit/38262c56d34784401659639b482a4a1224e1e59a) [commit #2](https://github.com/miguelgrinberg/microdot/commit/f6cba2c0f7e18e2f32b5adb779fb037b6c473eab))
- Added documentation on using alternative uTemplate loaders ([commit](https://github.com/miguelgrinberg/microdot/commit/bf519478cbc6e296785241cd7d01edb23c317cd3))
- Added CircuitPython builds ([commit](https://github.com/miguelgrinberg/microdot/commit/e44c271bae88f4327d3eda16d8780ac264d1ebab))
**Release 2.0.3** - 2024-01-07
- Add a limit to WebSocket message size [#193](https://github.com/miguelgrinberg/microdot/issues/193) ([commit](https://github.com/miguelgrinberg/microdot/commit/5d188e8c0ddef6ce633ca702dbdd4a90f2799597))
- Pass keyword arguments to thread executor in the correct way [#195](https://github.com/miguelgrinberg/microdot/issues/195) ([commit](https://github.com/miguelgrinberg/microdot/commit/6712c47400d7c426c88032f65ab74466524eccab))
- Update uasyncio library used in tests to include new TLS support ([commit](https://github.com/miguelgrinberg/microdot/commit/c8c91e83457d24320f22c9a74e80b15e06b072ca))
- Documentation improvements ([commit](https://github.com/miguelgrinberg/microdot/commit/b80b6b64d02d21400ca8a5077f5ed1127cc202ae))
**Release 2.0.2** - 2023-12-28
- Support binary data in the SSE extension ([commit](https://github.com/miguelgrinberg/microdot/commit/1fc11193da0d298f5539e2ad218836910a13efb2))
- Upgrade micropython tests to use v1.22 + initial CircuitPython testing work ([commit](https://github.com/miguelgrinberg/microdot/commit/79452a46992351ccad2c0317c20bf50be0d76641))
- Improvements to migration guide ([commit](https://github.com/miguelgrinberg/microdot/commit/84842e39c360a8b3ddf36feac8af201fb19bbb0b))
- Remove spurious async in documentation example [#187](https://github.com/miguelgrinberg/microdot/issues/187) ([commit](https://github.com/miguelgrinberg/microdot/commit/ad368be993e2e3007579f1d3880e36d60c71da92)) (thanks **Tak Tran**!)
**Release 2.0.1** - 2023-12-23
- Addressed some inadvertent mistakes in the template extensions ([commit](https://github.com/miguelgrinberg/microdot/commit/bd18ceb4424e9dfb52b1e6d498edd260aa24fc53))


@@ -32,10 +32,25 @@ describes the backwards incompatible changes that were made.
## Resources
- Documentation
- [Stable](https://microdot.readthedocs.io/en/stable/)
- [Latest](https://microdot.readthedocs.io/en/latest/)
- Still using version 1?
- [Code](https://github.com/miguelgrinberg/microdot/tree/v1)
- [Documentation](https://microdot.readthedocs.io/en/v1/)
- [Change Log](https://github.com/miguelgrinberg/microdot/blob/main/CHANGES.md)
- Documentation
- [Latest](https://microdot.readthedocs.io/en/latest/)
- [Stable (v2)](https://microdot.readthedocs.io/en/stable/)
- [Legacy (v1)](https://microdot.readthedocs.io/en/v1/) ([Code](https://github.com/miguelgrinberg/microdot/tree/v1))
## Roadmap
The following features are planned for future releases of Microdot, both for
MicroPython and CPython:
- Authentication support, similar to [Flask-Login](https://github.com/maxcountryman/flask-login) for Flask (**Added in version 2.1**)
- Support for forms encoded in `multipart/form-data` format (**Added in version 2.2**)
- OpenAPI integration, similar to [APIFairy](https://github.com/miguelgrinberg/apifairy) for Flask
In addition to the above, the following extensions are also under consideration,
but only for CPython:
- Database integration through [SQLAlchemy](https://github.com/sqlalchemy/sqlalchemy)
- Socket.IO support through [python-socketio](https://github.com/miguelgrinberg/python-socketio)
Do you have other ideas to propose? Let's [discuss them](https://github.com/miguelgrinberg/microdot/discussions/new?category=ideas)!

BIN bin/circuitpython Executable file (binary content not shown)

Binary file not shown.


@@ -1,8 +1,8 @@
API Reference
=============
``microdot`` module
-------------------
Core API
--------
.. autoclass:: microdot.Microdot
:members:
@@ -13,52 +13,82 @@ API Reference
.. autoclass:: microdot.Response
:members:
.. autoclass:: microdot.URLPattern
:members:
``websocket`` extension
-----------------------
Multipart Forms
---------------
.. automodule:: microdot.multipart
:members:
WebSocket
---------
.. automodule:: microdot.websocket
:members:
``utemplate`` templating extension
----------------------------------
Server-Sent Events (SSE)
------------------------
.. automodule:: microdot.sse
:members:
Templates (uTemplate)
---------------------
.. automodule:: microdot.utemplate
:members:
``jinja`` templating extension
------------------------------
Templates (Jinja)
-----------------
.. automodule:: microdot.jinja
:members:
``session`` extension
---------------------
User Sessions
-------------
.. automodule:: microdot.session
:members:
``cors`` extension
------------------
Authentication
--------------
.. automodule:: microdot.auth
:inherited-members:
:special-members: __call__
:members:
User Logins
-----------
.. automodule:: microdot.login
:inherited-members:
:special-members: __call__
:members:
Cross-Origin Resource Sharing (CORS)
------------------------------------
.. automodule:: microdot.cors
:members:
``test_client`` extension
-------------------------
Test Client
-----------
.. automodule:: microdot.test_client
:members:
``asgi`` extension
------------------
ASGI
----
.. autoclass:: microdot.asgi.Microdot
:members:
:exclude-members: shutdown, run
``wsgi`` extension
-------------------
WSGI
----
.. autoclass:: microdot.wsgi.Microdot
:members:


@@ -5,8 +5,82 @@ Microdot is a highly extensible web application framework. The extensions
described in this section are maintained as part of the Microdot project in
the same source code repository.
WebSocket Support
~~~~~~~~~~~~~~~~~
Multipart Forms
~~~~~~~~~~~~~~~
.. list-table::
:align: left
* - Compatibility
- | CPython & MicroPython
* - Required Microdot source files
- | `multipart.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot/multipart.py>`_
| `helpers.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot/helpers.py>`_
* - Required external dependencies
- | None
* - Examples
- | `formdata.py <https://github.com/miguelgrinberg/microdot/blob/main/examples/uploads/formdata.py>`_
The multipart extension handles multipart forms, including those that have file
uploads.
The :func:`with_form_data <microdot.multipart.with_form_data>` decorator
provides the simplest way to work with these forms. With this decorator added
to the route, whenever the client sends a multipart request the
:attr:`request.form <microdot.Request.form>` and
:attr:`request.files <microdot.Request.files>` properties are populated with
the submitted data. For form fields the field values are always strings. For
files, they are instances of the
:class:`FileUpload <microdot.multipart.FileUpload>` class.
Example::
from microdot.multipart import with_form_data

@app.post('/upload')
@with_form_data
async def upload(request):
    print('form fields:', request.form)
    print('files:', request.files)
One disadvantage of the ``@with_form_data`` decorator is that it has to copy
any uploaded files to memory or temporary disk files, depending on their size.
The :attr:`FileUpload.max_memory_size <microdot.multipart.FileUpload.max_memory_size>`
attribute can be used to control the cutoff size above which a file upload
is transferred to a temporary file.
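For example, a memory-constrained board could lower this cutoff before handling uploads (a minimal sketch; the 8 KB figure is an arbitrary illustration, not a documented default)::

from microdot.multipart import FileUpload

# assumption: max_memory_size is the class-level attribute referenced above;
# uploads larger than this many bytes are spooled to a temporary file
FileUpload.max_memory_size = 8 * 1024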
A more performant alternative to the ``@with_form_data`` decorator is the
:class:`FormDataIter <microdot.multipart.FormDataIter>` class, which iterates
over the form fields sequentially, giving the application the option to parse
the form fields on the fly and decide what to copy and what to discard. When
using ``FormDataIter`` the ``request.form`` and ``request.files`` attributes
are not used.
Example::
from microdot.multipart import FormDataIter

@app.post('/upload')
async def upload(request):
    async for name, value in FormDataIter(request):
        print(name, value)
For fields that contain an uploaded file, the ``value`` returned by the
iterator is the same ``FileUpload`` instance. The application can choose to
save the file with the :meth:`save() <microdot.multipart.FileUpload.save>`
method, or read it with the :meth:`read() <microdot.multipart.FileUpload.read>`
method, optionally passing a size to read it in chunks. The
:meth:`copy() <microdot.multipart.FileUpload.copy>` method is also available to
apply the copying logic used by the ``@with_form_data`` decorator, which is
inefficient but allows the file to be set aside to be processed later, after
the remaining form fields.
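As a rough sketch of that pattern (assuming, for illustration, that ``save()`` is awaitable and that the destination path is chosen by the application), file fields can be streamed to storage while plain fields are handled inline::

from microdot.multipart import FormDataIter, FileUpload

@app.post('/upload')
async def upload(request):
    async for name, value in FormDataIter(request):
        if isinstance(value, FileUpload):
            # hypothetical destination path; stream the upload instead of copying it
            await value.save('/data/' + name)
        else:
            print('field', name, '=', value)
    return 'OK'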
WebSocket
~~~~~~~~~
.. list-table::
:align: left
@@ -16,6 +90,7 @@ WebSocket Support
* - Required Microdot source files
- | `websocket.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot/websocket.py>`_
| `helpers.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot/helpers.py>`_
* - Required external dependencies
- | None
@@ -32,15 +107,44 @@ messages respectively.
Example::
@app.route('/echo')
@with_websocket
async def echo(request, ws):
from microdot.websocket import with_websocket
@app.route('/echo')
@with_websocket
async def echo(request, ws):
while True:
message = await ws.receive()
await ws.send(message)
To end the WebSocket connection, the route handler can exit, without returning
anything::
@app.route('/echo')
@with_websocket
async def echo(request, ws):
    while True:
        message = await ws.receive()
        if message == 'exit':
            break
        await ws.send(message)
    await ws.send('goodbye')
If the client ends the WebSocket connection from their side, the route function
is cancelled. The route function can catch the ``CancelledError`` exception
from asyncio to perform cleanup tasks::
@app.route('/echo')
@with_websocket
async def echo(request, ws):
    try:
        while True:
            message = await ws.receive()
            await ws.send(message)
    except asyncio.CancelledError:
        print('Client disconnected!')
Server-Sent Events Support
~~~~~~~~~~~~~~~~~~~~~~~~~~
Server-Sent Events
~~~~~~~~~~~~~~~~~~
.. list-table::
:align: left
@@ -50,6 +154,7 @@ Server-Sent Events Support
* - Required Microdot source files
- | `sse.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot/sse.py>`_
| `helpers.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot/helpers.py>`_
* - Required external dependencies
- | None
@@ -65,6 +170,8 @@ asynchronous method to send an event to the client.
Example::
from microdot.sse import with_sse
@app.route('/events')
@with_sse
async def events(request, sse):
@@ -73,13 +180,32 @@ Example::
await sse.send({'counter': i}) # unnamed event
await sse.send('end', event='comment') # named event
To end the SSE connection, the route handler can exit, without returning
anything, as shown in the above examples.
If the client ends the SSE connection from their side, the route function is
cancelled. The route function can catch the ``CancelledError`` exception from
asyncio to perform cleanup tasks::
@app.route('/events')
@with_sse
async def events(request, sse):
    try:
        i = 0
        while True:
            await asyncio.sleep(1)
            await sse.send({'counter': i})
            i += 1
    except asyncio.CancelledError:
        print('Client disconnected!')
.. note::
The SSE protocol is unidirectional, so there is no ``receive()`` method in
the SSE object. For bidirectional communication with the client, use the
WebSocket extension.
Rendering Templates
~~~~~~~~~~~~~~~~~~~
Templates
~~~~~~~~~
Many web applications use HTML templates for rendering content to clients.
Microdot includes extensions to render templates with the
@@ -134,6 +260,21 @@ method::
Template.initialize('my_templates')
By default templates are automatically compiled the first time they are
rendered, or when their last modified timestamp is more recent than the
compiled file's timestamp. This loading behavior can be changed by switching
to a different template loader. For example, if the templates are pre-compiled,
the timestamp check and compile steps can be removed by switching to the
"compiled" template loader::
from utemplate import compiled
from microdot.utemplate import Template
Template.initialize(loader_class=compiled.Loader)
Consult the `uTemplate documentation <https://github.com/pfalcon/utemplate>`_
for additional information regarding template loaders.
Using the Jinja Engine
^^^^^^^^^^^^^^^^^^^^^^
@@ -187,8 +328,8 @@ must be used.
.. note::
The Jinja extension is not compatible with MicroPython.
Maintaining Secure User Sessions
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Secure User Sessions
~~~~~~~~~~~~~~~~~~~~
.. list-table::
:align: left
@@ -198,6 +339,7 @@ Maintaining Secure User Sessions
* - Required Microdot source files
- | `session.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot/session.py>`_
| `helpers.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot/helpers.py>`_
* - Required external dependencies
- | CPython: `PyJWT <https://pyjwt.readthedocs.io/>`_
@@ -255,6 +397,205 @@ The :func:`save() <microdot.session.SessionDict.save>` and
:func:`delete() <microdot.session.SessionDict.delete>` methods are used to update
and destroy the user session respectively.
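As a minimal sketch (assuming the ``with_session`` decorator and the dict-like session object described in this section), a pair of routes could update and destroy the session as follows::

from microdot.session import Session, with_session

Session(app, secret_key='top-secret!')

@app.post('/login')
@with_session
async def login(request, session):
    # store a value in the session and re-sign the session cookie
    session['username'] = request.form.get('username')
    session.save()
    return 'logged in'

@app.post('/logout')
@with_session
async def logout(request, session):
    # remove the session cookie
    session.delete()
    return 'logged out'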
Authentication
~~~~~~~~~~~~~~
.. list-table::
:align: left
* - Compatibility
- | CPython & MicroPython
* - Required Microdot source files
- | `auth.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot/auth.py>`_
* - Required external dependencies
- | None
* - Examples
- | `basic_auth.py <https://github.com/miguelgrinberg/microdot/blob/main/examples/auth/basic_auth.py>`_
| `token_auth.py <https://github.com/miguelgrinberg/microdot/blob/main/examples/auth/token_auth.py>`_
The authentication extension provides helper classes for two commonly used
authentication patterns, described below.
Basic Authentication
^^^^^^^^^^^^^^^^^^^^
`Basic Authentication <https://en.wikipedia.org/wiki/Basic_access_authentication>`_
is a method of authentication that is part of the HTTP specification. It allows
clients to authenticate to a server using a username and a password. Web
browsers have native support for Basic Authentication and will automatically
prompt the user for a username and a password when a protected resource is
accessed.
To use Basic Authentication, create an instance of the :class:`BasicAuth <microdot.auth.BasicAuth>`
class::
from microdot.auth import BasicAuth
auth = BasicAuth(app)
Next, create an authentication function. The function must accept a request
object and a username and password pair provided by the user. If the
credentials are valid, the function must return an object that represents the
user. If the authentication function cannot validate the user provided
credentials it must return ``None``. Decorate the function with
``@auth.authenticate``::
@auth.authenticate
async def verify_user(request, username, password):
    user = await load_user_from_database(username)
    if user and user.verify_password(password):
        return user
To protect a route with authentication, add the ``auth`` instance as a
decorator::
@app.route('/')
@auth
async def index(request):
    return f'Hello, {request.g.current_user}!'
While running an authenticated request, the user object returned by the
authentication function is accessible as ``request.g.current_user``.
If an endpoint is intended to work with or without authentication, then it can
be protected with the ``auth.optional`` decorator::
@app.route('/')
@auth.optional
async def index(request):
    if request.g.current_user:
        return f'Hello, {request.g.current_user}!'
    else:
        return 'Hello, anonymous user!'
As shown in the example, a route can check ``request.g.current_user`` to
determine if the user is authenticated or not.
Token Authentication
^^^^^^^^^^^^^^^^^^^^
To set up token authentication, create an instance of
:class:`TokenAuth <microdot.auth.TokenAuth>`::
from microdot.auth import TokenAuth
auth = TokenAuth()
Then add a function that verifies the token and returns the user it belongs to,
or ``None`` if the token is invalid or expired::
@auth.authenticate
async def verify_token(request, token):
    return load_user_from_token(token)
As with Basic authentication, the ``auth`` instance is used as a decorator to
protect your routes, and the authenticated user is accessible from the request
object as ``request.g.current_user``::
@app.route('/')
@auth
async def index(request):
    return f'Hello, {request.g.current_user}!'
Optional authentication can also be used with tokens::
@app.route('/')
@auth.optional
async def index(request):
    if request.g.current_user:
        return f'Hello, {request.g.current_user}!'
    else:
        return 'Hello, anonymous user!'
User Logins
~~~~~~~~~~~
.. list-table::
:align: left
* - Compatibility
- | CPython & MicroPython
* - Required Microdot source files
- | `login.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot/login.py>`_
| `session.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot/session.py>`_
| `helpers.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot/helpers.py>`_
* - Required external dependencies
- | CPython: `PyJWT <https://pyjwt.readthedocs.io/>`_
| MicroPython: `jwt.py <https://github.com/micropython/micropython-lib/blob/master/python-ecosys/pyjwt/jwt.py>`_,
`hmac.py <https://github.com/micropython/micropython-lib/blob/master/python-stdlib/hmac/hmac.py>`_
* - Examples
- | `login.py <https://github.com/miguelgrinberg/microdot/blob/main/examples/login/login.py>`_
The login extension provides user login functionality. The logged in state of
the user is stored in the user session cookie, and an optional "remember me"
cookie can also be added to keep the user logged in across browser sessions.
To use this extension, create instances of the
:class:`Session <microdot.session.Session>` and :class:`Login <microdot.login.Login>`
class::
Session(app, secret_key='top-secret!')
login = Login()
The ``Login`` class accepts an optional argument with the URL of the login page.
The default for this URL is */login*.
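For instance, a hypothetical application that serves its login form at */signin* could pass that URL when creating the instance (a sketch, assuming the URL is the first positional argument)::

from microdot.login import Login

# unauthenticated users are redirected to /signin instead of the default /login
login = Login('/signin')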
The application must represent users as objects with an ``id`` attribute. A
function decorated with ``@login.user_loader`` is used to load a user object::
@login.user_loader
async def get_user(user_id):
    return database.get_user(user_id)
The application must implement the login form. At the point in which the user
credentials have been received and verified, a call to the
:func:`login_user() <microdot.login.Login.login_user>` function must be made to
record the user in the user session::
@app.route('/login', methods=['GET', 'POST'])
async def login(request):
    # ...
    if user.check_password(password):
        return await login.login_user(request, user, remember=remember_me)
    return redirect('/login')
The optional ``remember`` argument is used to add a remember me cookie that
will log the user in automatically in future sessions. A value of ``True`` will
keep the login active for 30 days. Alternatively, an integer number of days
can be passed in this argument.
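For example, to keep the user logged in for one week instead (a sketch reusing the login route shown above)::

return await login.login_user(request, user, remember=7)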
Any routes that require the user to be logged in must be decorated with
:func:`@login <microdot.login.Login.__call__>`::
@app.route('/')
@login
async def index(request):
    # ...
Routes that are of a sensitive nature can be decorated with
:func:`@login.fresh <microdot.login.Login.fresh>`
instead. This decorator requires that the user has logged in during the current
session, and will ask the user to log in again if the session was
authenticated through a remember me cookie::
@app.get('/fresh')
@login.fresh
async def fresh(request):
    # ...
To log out a user, the :func:`logout_user() <microdot.login.Login.logout_user>` method
is used::
@app.post('/logout')
@login
async def logout(request):
    await login.logout_user(request)
    return redirect('/')
Cross-Origin Resource Sharing (CORS)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -271,7 +612,7 @@ Cross-Origin Resource Sharing (CORS)
- | None
* - Examples
- | `cors.py <https://github.com/miguelgrinberg/microdot/blob/main/examples/cors/cors.py>`_
- | `app.py <https://github.com/miguelgrinberg/microdot/blob/main/examples/cors/app.py>`_
The CORS extension provides support for `Cross-Origin Resource Sharing
(CORS) <https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS>`_. CORS is a
@@ -290,8 +631,8 @@ Example::
cors = CORS(app, allowed_origins=['https://example.com'],
            allow_credentials=True)
Testing with the Test Client
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Test Client
~~~~~~~~~~~
.. list-table::
:align: left
@@ -327,8 +668,8 @@ Example::
See the documentation for the :class:`TestClient <microdot.test_client.TestClient>`
class for more details.
Deploying on a Production Web Server
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Production Deployments
~~~~~~~~~~~~~~~~~~~~~~
The ``Microdot`` class creates its own simple web server. This is enough for an
application deployed with MicroPython, but when using CPython it may be useful
@@ -348,7 +689,7 @@ Using an ASGI Web Server
- | `asgi.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot/asgi.py>`_
* - Required external dependencies
- | An ASGI web server, such as `Uvicorn <https://uvicorn.org/>`_.
- | An ASGI web server, such as `Uvicorn <https://www.uvicorn.org/>`_.
* - Examples
- | `hello_asgi.py <https://github.com/miguelgrinberg/microdot/blob/main/examples/hello/hello_asgi.py>`_


@@ -1,5 +1,8 @@
Cross-Compiling and Freezing Microdot (MicroPython Only)
--------------------------------------------------------
Cross-Compiling and Freezing Microdot
-------------------------------------
.. note::
This section only applies when using Microdot on MicroPython.
Microdot is a fairly small framework, so its size is not something you need to
be concerned about unless you are working with MicroPython on hardware with a
@@ -36,7 +39,7 @@ Cross-Compiling
An issue that is common with low-end microcontroller boards is that they do not
have enough RAM for the MicroPython compiler to compile the source files, but
once the code is compiled they are able to run it without problems.
once the code is compiled they are able to run it just fine.
To address this, MicroPython allows you to cross-compile source files on your
desktop or laptop computer and then upload their compiled versions to the
@@ -82,8 +85,8 @@ imported directly from the device's ROM, leaving more RAM available for
application use.
The process to create a custom firmware is unfortunately non-trivial and
different depending on the device, so you will need to consult the MicroPython
documentation that applies to your device to learn how to do this.
different for each microcontroller platform, so you will need to consult the
MicroPython documentation that applies to your device to learn how to do this.
The part of the process that is common to all devices is the creation of a
`manifest file <https://docs.micropython.org/en/latest/reference/manifest.html>`_


@@ -25,14 +25,15 @@ and incorporated into a custom MicroPython firmware.
Use the following guidelines to know what files to copy:
- For a minimal setup with only the base web server functionality, copy
* For a minimal setup with only the base web server functionality, copy
`microdot.py <https://github.com/miguelgrinberg/microdot/blob/main/src/microdot/microdot.py>`_
into your project.
- For a configuration that includes one or more optional extensions, create a
* For a configuration that includes one or more optional extensions, create a
*microdot* directory in your device and copy the following files:
- `__init__.py <https://github.com/miguelgrinberg/microdot/blob/main/src/microdot/__init__.py>`_
- `microdot.py <https://github.com/miguelgrinberg/microdot/blob/main/src/microdot/microdot.py>`_
- any needed `extensions <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot>`_.
* `__init__.py <https://github.com/miguelgrinberg/microdot/blob/main/src/microdot/__init__.py>`_
* `microdot.py <https://github.com/miguelgrinberg/microdot/blob/main/src/microdot/microdot.py>`_
* any needed `extensions <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot>`_.
Getting Started
@@ -81,8 +82,34 @@ handler functions can be defined as ``async def`` or ``def`` functions, but
``async def`` functions are recommended for performance.
The :func:`run() <microdot.Microdot.run>` method starts the application's web
server on port 5000 by default. This method blocks while it waits for
connections from clients.
server on port 5000 by default, and creates its own asynchronous loop. This
method blocks while it waits for connections from clients.
For some applications it may be necessary to run the web server alongside other
asynchronous tasks, on an already running loop. In that case, instead of
``app.run()`` the web server can be started by invoking the
:func:`start_server() <microdot.Microdot.start_server>` coroutine as shown in
the following example::
import asyncio
from microdot import Microdot

app = Microdot()

@app.route('/')
async def index(request):
    return 'Hello, world!'

async def main():
    # start the server in a background task
    server = asyncio.create_task(app.start_server())
    # ... do other asynchronous work here ...
    # cleanup before ending the application
    await server

asyncio.run(main())
Running with CPython
^^^^^^^^^^^^^^^^^^^^
@@ -91,7 +118,7 @@ Running with CPython
:align: left
* - Required Microdot source files
- | `microdot.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot.py>`_
- | `microdot.py <https://github.com/miguelgrinberg/microdot/blob/main/src/microdot/microdot.py>`_
* - Required external dependencies
- | None
@@ -117,7 +144,7 @@ Running with MicroPython
:align: left
* - Required Microdot source files
- | `microdot.py <https://github.com/miguelgrinberg/microdot/tree/main/src/microdot.py>`_
- | `microdot.py <https://github.com/miguelgrinberg/microdot/blob/main/src/microdot/microdot.py>`_
* - Required external dependencies
- | None
@@ -144,8 +171,9 @@ changed by passing the ``port`` argument to the ``run()`` method.
Web Server Configuration
^^^^^^^^^^^^^^^^^^^^^^^^
The :func:`run() <microdot.Microdot.run>` method supports a few arguments to
configure the web server.
The :func:`run() <microdot.Microdot.run>` and
:func:`start_server() <microdot.Microdot.start_server>` methods support a few
arguments to configure the web server.
- ``port``: The port number to listen on. Pass the desired port number in this
argument to use a port different than the default of 5000. For example::
@@ -171,10 +199,8 @@ configure the web server.
app.run(port=4443, debug=True, ssl=sslctx)
.. note::
The ``ssl`` argument can only be used with CPython at this time, because
MicroPython's asyncio module does not currently support SSL certificates or
TLS encryption. Work on this is
`in progress <https://github.com/micropython/micropython/pull/11897>`_.
When using CPython, the certificate and key files must be given in PEM
format. When using MicroPython, these files must be given in DER format.
Defining Routes
~~~~~~~~~~~~~~~
@@ -297,21 +323,58 @@ match and the route will not be called.
A special type ``path`` can be used to capture the remainder of the path as a
single argument. The difference between an argument of type ``path`` and one of
type ``string`` is that the latter stops capturing when a ``/`` appears in the
URL.
URL::
@app.get('/tests/<path:path>')
async def get_test(request, path):
    return 'Test: ' + path
For the most control, the ``re`` type allows the application to provide a
custom regular expression for the dynamic component. The next example defines
a route that only matches usernames that begin with an upper or lower case
letter, followed by a sequence of letters or numbers::
The ``re`` type allows the application to provide a custom regular expression
for the dynamic component. The next example defines a route that only matches
usernames that begin with an upper or lower case letter, followed by a sequence
of letters or numbers::
@app.get('/users/<re:[a-zA-Z][a-zA-Z0-9]*:username>')
async def get_user(request, username):
    return 'User: ' + username
The ``re`` type returns the URL component as a string, which sometimes may not
be the most convenient. To convert a path component to something more
meaningful than a string, the application can register a custom URL component
type and provide a parser function that performs the conversion. In the
following example, a ``hex`` custom type is registered to automatically
convert hex numbers given in the path to numbers::
from microdot import URLPattern

URLPattern.register_type('hex', parser=lambda value: int(value, 16))

@app.get('/users/<hex:user_id>')
async def get_user(request, user_id):
    user = get_user_by_id(user_id)
    # ...
In addition to the parser, the custom URL component can include a pattern,
given as a regular expression. When a pattern is provided, the URL component
will only match if the regular expression matches the value passed in the URL.
The ``hex`` example above can be expanded with a pattern as follows::
URLPattern.register_type('hex', pattern='[0-9a-fA-F]+',
parser=lambda value: int(value, 16))
In cases where a pattern isn't provided, or when the pattern is unable to
filter out all invalid values, the parser function can return ``None`` to
indicate a failed match. The next example shows how the parser for the ``hex``
type can be expanded to do that::
def hex_parser(value):
    try:
        return int(value, 16)
    except ValueError:
        return None

URLPattern.register_type('hex', parser=hex_parser)
.. note::
Dynamic path components are passed to route functions as keyword arguments,
so the names of the function arguments must match the names declared in the
@@ -419,7 +482,7 @@ Mounting a Sub-Application
^^^^^^^^^^^^^^^^^^^^^^^^^^
Small Microdot applications can be written as a single source file, but this
is not the best option for applications that past a certain size. To make it
is not the best option for applications that pass a certain size. To make it
simpler to write large applications, Microdot supports the concept of
sub-applications that can be "mounted" on a larger application, possibly with
a common URL prefix applied to all of its routes. For developers familiar with
@@ -462,7 +525,7 @@ the sub-applications to build the larger combined application::
from customers import customers_app
from orders import orders_app
async def create_app():
def create_app():
app = Microdot()
app.mount(customers_app, url_prefix='/customers')
app.mount(orders_app, url_prefix='/orders')
@@ -475,11 +538,25 @@ The resulting application will have the customer endpoints available at
*/customers/* and the order endpoints available at */orders/*.
.. note::
Before-request, after-request and error handlers defined in the
sub-application are also copied over to the main application at mount time.
Once installed in the main application, these handlers will apply to the
whole application and not just the sub-application in which they were
created.
During the handling of a request, the
:attr:`Request.url_prefix <microdot.Microdot.url_prefix>` attribute is
set to the URL prefix under which the sub-application was mounted, or an
empty string if the endpoint did not come from a sub-application or the
sub-application was mounted without a URL prefix. It is possible to issue a
redirect that is relative to the sub-application as follows::
return redirect(request.url_prefix + '/relative-url')
When mounting an application as shown above, before-request, after-request and
error handlers defined in the sub-application are copied over to the main
application at mount time. Once installed in the main application, these
handlers will apply to the whole application and not just the sub-application
in which they were created.
The :func:`mount() <microdot.Microdot.mount>` method has a ``local`` argument
that defaults to ``False``. When this argument is set to ``True``, the
before-request, after-request and error handlers defined in the sub-application
will only apply to the sub-application.
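For instance, a sub-application whose handlers should remain local to it can
be mounted as in the following minimal sketch (``admin_app`` is a hypothetical
sub-application)::

app.mount(admin_app, url_prefix='/admin', local=True)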
Shutting Down the Server
^^^^^^^^^^^^^^^^^^^^^^^^
@@ -524,6 +601,13 @@ The request object provides access to the request attributes, including:
specified by the client, or ``None`` if no content type was specified.
- :attr:`content_length <microdot.Request.content_length>`: The content
length of the request, or 0 if no content length was specified.
- :attr:`json <microdot.Request.json>`: The parsed JSON data in the request
body. See :ref:`below <JSON Payloads>` for additional details.
- :attr:`form <microdot.Request.form>`: The parsed form data in the request
body, as a dictionary. See :ref:`below <Form Data>` for additional details.
- :attr:`files <microdot.Request.files>`: A dictionary with the file uploads
included in the request body. Note that file uploads are only supported when
the :ref:`Multipart Forms` extension is used.
- :attr:`client_addr <microdot.Request.client_addr>`: The network address of
the client, as a tuple (host, port).
- :attr:`app <microdot.Request.app>`: The application instance that created the
@@ -550,8 +634,8 @@ to use this attribute::
The client must set the ``Content-Type`` header to ``application/json`` for
the ``json`` attribute of the request object to be populated.
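For example, a route that accepts a JSON payload could be written as in this
minimal sketch, assuming dictionaries returned from a route are serialized to
JSON as described under JSON Responses below (the in-memory ``customers`` list
is a placeholder for a real data store)::

customers = []

@app.post('/customers')
async def create_customer(request):
    data = request.json  # None unless the client sent a JSON body
    if data is None:
        return {'error': 'JSON payload expected'}, 400
    customers.append(data)
    return data, 201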
URLEncoded Form Data
^^^^^^^^^^^^^^^^^^^^
Form Data
^^^^^^^^^
The request object also supports standard HTML form submissions through the
:attr:`form <microdot.Request.form>` attribute, which presents the form data
@@ -565,9 +649,10 @@ as a :class:`MultiDict <microdot.MultiDict>` object. Example::
return f'Hello {name}'
.. note::
Form submissions are only parsed when the ``Content-Type`` header is set by
the client to ``application/x-www-form-urlencoded``. Form submissions using
the ``multipart/form-data`` content type are currently not supported.
Form submissions are automatically parsed when the ``Content-Type`` header is
set by the client to ``application/x-www-form-urlencoded``. For form
submissions that use the ``multipart/form-data`` content type the
:ref:`Multipart Forms` extension must be used.
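Putting this together, a complete form handling route might look like the
following minimal sketch (the ``name`` field is an arbitrary example)::

@app.post('/form')
async def handle_form(request):
    # request.form behaves like a dictionary of submitted fields
    name = request.form.get('name', 'stranger')
    return f'Hello {name}'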
Accessing the Raw Request Body
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -672,15 +757,18 @@ sections describe the different types of responses that are supported.
The Three Parts of a Response
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Route functions can return one, two or three values. The first or only value is
always returned to the client in the response body::
Route functions can return one, two or three values. The first and most
important value is the response body::
@app.get('/')
async def index(request):
return 'Hello, World!'
In the above example, Microdot issues a standard 200 status code response, and
inserts default headers.
In the above example, Microdot issues a standard 200 status code response
indicating a successful request. The body of the response is the
``'Hello, World!'`` string returned by the function. Microdot includes default
headers with this response, including the ``Content-Type`` header set to
``text/plain`` to indicate a response in plain text.
The application can provide its own status code as a second value returned from
the route to override the 200 default. The example below returns a 202 status
@@ -692,22 +780,30 @@ code::
The application can also return a third value, a dictionary with additional
headers that are added to, or replace the default ones included by Microdot.
The next example returns an HTML response, instead of a default text response::
The next example returns an HTML response, instead of the default plain text
response::
@app.get('/')
async def index(request):
return '<h1>Hello, World!</h1>', 202, {'Content-Type': 'text/html'}
If the application needs to return custom headers, but does not need to change
the default status code, then it can return two values, omitting the status
code::
If the application does not need to return a body, then it can omit it and
have the status code as the first or only returned value::
@app.get('/')
async def index(request):
return 204
Likewise, if the application needs to return a body and custom headers, but
does not need to change the default status code, then it can return two values,
omitting the status code::
@app.get('/')
async def index(request):
return '<h1>Hello, World!</h1>', {'Content-Type': 'text/html'}
The application can also return a :class:`Response <microdot.Response>` object
containing all the details of the response as a single value.
Lastly, the application can also return a :class:`Response <microdot.Response>`
object containing all the details of the response as a single value.
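As a rough sketch, assuming the ``Response`` constructor accepts the body,
status code and headers in that order::

from microdot import Response

@app.get('/')
async def index(request):
    return Response('<h1>Hello, World!</h1>', 202,
                    {'Content-Type': 'text/html'})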
JSON Responses
^^^^^^^^^^^^^^
@@ -855,18 +951,36 @@ Another option is to create a response object directly in the route function::
Concurrency
~~~~~~~~~~~
Microdot implements concurrency through the ``asyncio`` package. Applications
must ensure their handlers do not block, as this will prevent other concurrent
requests from being handled.
Microdot implements concurrency through the ``asyncio`` package, which means
that applications must be careful to prevent blocking in their handlers.
When running under CPython, ``async def`` handler functions run as native
asyncio tasks, while ``def`` handler functions are executed in a
`thread executor <https://docs.python.org/3/library/asyncio-eventloop.html#asyncio.loop.run_in_executor>`_
to prevent them from blocking the asynchronous loop.
"async def" handlers
^^^^^^^^^^^^^^^^^^^^
The recommendation for route handlers in Microdot is to use asynchronous
functions, declared as ``async def``. Microdot executes these handler
functions as native asynchronous tasks. The standard considerations for writing
asynchronous code apply, and in particular blocking calls should be avoided to
ensure the application runs smoothly and is always responsive.
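For example, an asynchronous handler that needs to pause should await a
non-blocking call such as ``asyncio.sleep()`` rather than call ``time.sleep()``,
which would stall the loop. A minimal sketch::

import asyncio

@app.get('/slow')
async def slow(request):
    await asyncio.sleep(1)  # yields control to the loop while waiting
    return 'done'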
"def" handlers
^^^^^^^^^^^^^^
Microdot also supports the use of synchronous route handlers, declared as
standard ``def`` functions. These handlers are treated differently under
CPython and MicroPython.
When running on CPython, Microdot executes synchronous handlers in a
`thread executor <https://docs.python.org/3/library/asyncio-eventloop.html#asyncio.loop.run_in_executor>`_,
which uses a thread pool. The use of blocking or CPU intensive code in these
handlers does not have such a negative effect on the application, because
handlers do not run on the same thread as the asynchronous loop. On the other
hand, the application will be affected by threading issues such as those caused
by the Global Interpreter Lock.
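A synchronous handler that performs blocking work might look like the
following minimal sketch::

import time

@app.get('/report')
def report(request):
    # under CPython this blocking call runs in a worker thread, so it does
    # not stall the asyncio loop; under MicroPython it would block it
    time.sleep(2)
    return 'report ready'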
Under MicroPython the situation is different. Most microcontroller boards
implementing MicroPython do not have threading support or executors, so ``def``
handler functions in this platform can only run in the main and only thread.
These functions will block the asynchronous loop when they take too long to
complete so ``async def`` handlers properly written to allow other handlers to
run in parallel should be preferred.
do not have or have very limited threading support, so Microdot executes
synchronous handlers in the main and often only thread available. This means
that these functions will block the asynchronous loop when they take too long
to complete. The use of properly written asynchronous handlers should be
preferred.


@@ -39,7 +39,7 @@ extension.
Any applications built using the asyncio extension will need to update their
imports from this::
from microdot.asyncio import Microdot
from microdot_asyncio import Microdot
to this::
@@ -94,7 +94,7 @@ as a single string::
Streamed templates also have an asynchronous version::
return await Template('index.html').generate_async(title='Home')
return Template('index.html').generate_async(title='Home')
Class-based user sessions
~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -138,5 +138,8 @@ deployed with standard WSGI servers such as Gunicorn.
WebSocket support when using the WSGI extension is enabled when using a
compatible web server. At this time only Gunicorn is supported for WebSocket.
Given that WebSocket support is asynchronous, it would be better to switch to
the ASGI extension, which has full support for WebSocket as defined in the ASGI
specification.
As before, the WSGI extension is not available under MicroPython.

1
examples/auth/README.md Normal file

@@ -0,0 +1 @@
This directory contains examples that demonstrate basic and token authentication.


@@ -0,0 +1,31 @@
from microdot import Microdot
from microdot.auth import BasicAuth
from pbkdf2 import generate_password_hash, check_password_hash
# this example provides an implementation of the generate_password_hash and
# check_password_hash functions that can be used in MicroPython. On CPython
# there are many other options for password hashing so there is no need to use
# this custom solution.
USERS = {
'susan': generate_password_hash('hello'),
'david': generate_password_hash('bye'),
}
app = Microdot()
auth = BasicAuth()
@auth.authenticate
async def check_credentials(request, username, password):
if username in USERS and check_password_hash(USERS[username], password):
return username
@app.route('/')
@auth
async def index(request):
return f'Hello, {request.g.current_user}!'
if __name__ == '__main__':
app.run(debug=True)

47
examples/auth/pbkdf2.py Normal file

@@ -0,0 +1,47 @@
import os
import hashlib
# PBKDF2 secure password hashing algorithm obtained from:
# https://codeandlife.com/2023/01/06/how-to-calculate-pbkdf2-hmac-sha256-with-
# python,-example-code/
def sha256(b):
return hashlib.sha256(b).digest()
def ljust(b, n, f):
return b + f * (n - len(b))
def gethmac(key, content):
okeypad = bytes(v ^ 0x5c for v in ljust(key, 64, b'\0'))
ikeypad = bytes(v ^ 0x36 for v in ljust(key, 64, b'\0'))
return sha256(okeypad + sha256(ikeypad + content))
def pbkdf2(pwd, salt, iterations=1000):
U = salt + b'\x00\x00\x00\x01'
T = bytes(64)
for _ in range(iterations):
U = gethmac(pwd, U)
T = bytes(a ^ b for a, b in zip(U, T))
return T
# The number of iterations may need to be adjusted depending on the hardware.
# Lower numbers make the password hashing algorithm faster but less secure, so
# the largest number that can be tolerated should be used.
def generate_password_hash(password, salt=None, iterations=100000):
salt = salt or os.urandom(16)
dk = pbkdf2(password.encode(), salt, iterations)
return f'pbkdf2-hmac-sha256:{salt.hex()}:{iterations}:{dk.hex()}'
def check_password_hash(password_hash, password):
algorithm, salt, iterations, dk = password_hash.split(':')
iterations = int(iterations)
if algorithm != 'pbkdf2-hmac-sha256':
return False
return pbkdf2(password.encode(), salt=bytes.fromhex(salt),
iterations=iterations) == bytes.fromhex(dk)


@@ -0,0 +1,26 @@
from microdot import Microdot
from microdot.auth import TokenAuth
app = Microdot()
auth = TokenAuth()
TOKENS = {
'susan-token': 'susan',
'david-token': 'david',
}
@auth.authenticate
async def check_token(request, token):
if token in TOKENS:
return TOKENS[token]
@app.route('/')
@auth
async def index(request):
return f'Hello, {request.g.current_user}!'
if __name__ == '__main__':
app.run(debug=True)


@@ -9,16 +9,14 @@ aiofiles==23.2.1
annotated-types==0.6.0
# via pydantic
anyio==3.7.1
# via
# fastapi
# starlette
# via starlette
blinker==1.7.0
# via
# flask
# quart
build==1.0.3
# via pip-tools
certifi==2023.11.17
certifi==2024.7.4
# via requests
charset-normalizer==3.3.2
# via requests
@@ -28,15 +26,15 @@ click==8.1.7
# pip-tools
# quart
# uvicorn
fastapi==0.104.1
fastapi==0.109.1
# via -r requirements.in
flask==3.0.0
# via
# -r requirements.in
# quart
gunicorn==21.2.0
gunicorn==23.0.0
# via -r requirements.in
h11==0.14.0
h11==0.16.0
# via
# hypercorn
# uvicorn
@@ -51,7 +49,7 @@ hypercorn==0.15.0
# via quart
hyperframe==6.0.1
# via h2
idna==3.6
idna==3.7
# via
# anyio
# requests
@@ -59,7 +57,7 @@ itsdangerous==2.1.2
# via
# flask
# quart
jinja2==3.1.2
jinja2==3.1.6
# via
# flask
# quart
@@ -84,24 +82,24 @@ pydantic-core==2.14.5
# via pydantic
pyproject-hooks==1.0.0
# via build
quart==0.19.4
quart==0.20.0
# via -r requirements.in
requests==2.31.0
requests==2.32.4
# via -r requirements.in
sniffio==1.3.0
# via anyio
starlette==0.27.0
starlette==0.35.1
# via fastapi
typing-extensions==4.9.0
# via
# fastapi
# pydantic
# pydantic-core
urllib3==2.1.0
urllib3==2.5.0
# via requests
uvicorn==0.24.0.post1
# via -r requirements.in
werkzeug==3.0.1
werkzeug==3.0.6
# via
# flask
# quart

1
examples/login/README.md Normal file

@@ -0,0 +1 @@
This directory contains examples that demonstrate user logins.

123
examples/login/login.py Normal file

@@ -0,0 +1,123 @@
from microdot import Microdot, redirect
from microdot.session import Session
from microdot.login import Login
from pbkdf2 import generate_password_hash, check_password_hash
# this example provides an implementation of the generate_password_hash and
# check_password_hash functions that can be used in MicroPython. On CPython
# there are many other options for password hashing so there is no need to use
# this custom solution.
class User:
def __init__(self, id, username, password):
self.id = id
self.username = username
self.password_hash = self.create_hash(password)
def create_hash(self, password):
return generate_password_hash(password)
def check_password(self, password):
return check_password_hash(self.password_hash, password)
USERS = {
'user001': User('user001', 'susan', 'hello'),
'user002': User('user002', 'david', 'bye'),
}
app = Microdot()
Session(app, secret_key='top-secret!')
login = Login()
@login.user_loader
async def get_user(user_id):
return USERS.get(user_id)
@app.route('/login', methods=['GET', 'POST'])
async def login_page(request):
if request.method == 'GET':
return '''
<!doctype html>
<html>
<body>
<h1>Please Login</h1>
<form method="POST">
<p>
Username<br>
<input name="username" autofocus>
</p>
<p>
Password:<br>
<input name="password" type="password">
<br>
</p>
<p>
<input name="remember_me" type="checkbox"> Remember me
<br>
</p>
<p>
<button type="submit">Login</button>
</p>
</form>
</body>
</html>
''', {'Content-Type': 'text/html'}
username = request.form['username']
password = request.form['password']
remember_me = bool(request.form.get('remember_me'))
for user in USERS.values():
if user.username == username:
if user.check_password(password):
return await login.login_user(request, user,
remember=remember_me)
return redirect('/login')
@app.route('/')
@login
async def index(request):
return f'''
<!doctype html>
<html>
<body>
<h1>Hello, {request.g.current_user.username}!</h1>
<p>
<a href="/fresh">Click here</a> to access the fresh login page.
</p>
<form method="POST" action="/logout">
<button type="submit">Logout</button>
</form>
</body>
</html>
''', {'Content-Type': 'text/html'}
@app.get('/fresh')
@login.fresh
async def fresh(request):
return f'''
<!doctype html>
<html>
<body>
<h1>Hello, {request.g.current_user.username}!</h1>
<p>This page requires a fresh login session.</p>
<p><a href="/">Go back</a> to the main page.</p>
</body>
</html>
''', {'Content-Type': 'text/html'}
@app.post('/logout')
@login
async def logout(request):
await login.logout_user(request)
return redirect('/')
if __name__ == '__main__':
app.run(debug=True)

47
examples/login/pbkdf2.py Normal file

@@ -0,0 +1,47 @@
import os
import hashlib
# PBKDF2 secure password hashing algorithm obtained from:
# https://codeandlife.com/2023/01/06/how-to-calculate-pbkdf2-hmac-sha256-with-
# python,-example-code/
def sha256(b):
return hashlib.sha256(b).digest()
def ljust(b, n, f):
return b + f * (n - len(b))
def gethmac(key, content):
okeypad = bytes(v ^ 0x5c for v in ljust(key, 64, b'\0'))
ikeypad = bytes(v ^ 0x36 for v in ljust(key, 64, b'\0'))
return sha256(okeypad + sha256(ikeypad + content))
def pbkdf2(pwd, salt, iterations=1000):
U = salt + b'\x00\x00\x00\x01'
T = bytes(64)
for _ in range(iterations):
U = gethmac(pwd, U)
T = bytes(a ^ b for a, b in zip(U, T))
return T
# The number of iterations may need to be adjusted depending on the hardware.
# Lower numbers make the password hashing algorithm faster but less secure, so
# the largest number that can be tolerated should be used.
def generate_password_hash(password, salt=None, iterations=100000):
salt = salt or os.urandom(16)
dk = pbkdf2(password.encode(), salt, iterations)
return f'pbkdf2-hmac-sha256:{salt.hex()}:{iterations}:{dk.hex()}'
def check_password_hash(password_hash, password):
algorithm, salt, iterations, dk = password_hash.split(':')
iterations = int(iterations)
if algorithm != 'pbkdf2-hmac-sha256':
return False
return pbkdf2(password.encode(), salt=bytes.fromhex(salt),
iterations=iterations) == bytes.fromhex(dk)


@@ -1,3 +1,6 @@
# This is a simple example that demonstrates how to use the user session, but
# is not intended as a complete login solution. See the login subdirectory for
# a more complete example.
from microdot import Microdot, Response, redirect
from microdot.session import Session, with_session


@@ -1,16 +1,28 @@
import asyncio
from microdot import Microdot
from microdot import Microdot, send_file
from microdot.sse import with_sse
app = Microdot()
@app.route("/")
async def main(request):
return send_file('index.html')
@app.route('/events')
@with_sse
async def events(request, sse):
for i in range(10):
await asyncio.sleep(1)
await sse.send({'counter': i})
print('Client connected')
try:
i = 0
while True:
await asyncio.sleep(1)
i += 1
await sse.send({'counter': i})
except asyncio.CancelledError:
pass
print('Client disconnected')
app.run(debug=True)
app.run()

30
examples/sse/index.html Normal file

@@ -0,0 +1,30 @@
<!DOCTYPE html>
<html>
<head>
<title>Microdot SSE Example</title>
<meta charset="UTF-8">
</head>
<body>
<h1>Microdot SSE Example</h1>
<div id="log"></div>
<script>
const log = (text, color) => {
document.getElementById('log').innerHTML += `<span style="color: ${color}">${text}</span><br>`;
};
const eventSource = new EventSource('/events');
eventSource.onopen = () => {
log('Connection to server opened.', 'black');
};
eventSource.onmessage = (event) => {
log(`Received message: ${event.data}`, 'blue');
};
eventSource.onerror = (event) => {
log(`EventSource failed: ${event.type}`, 'red');
};
</script>
</body>
</html>


@@ -0,0 +1 @@
This directory contains examples that demonstrate sub-applications.

27
examples/subapps/app.py Normal file

@@ -0,0 +1,27 @@
from microdot import Microdot
from subapp import subapp
app = Microdot()
app.mount(subapp, url_prefix='/subapp')
@app.route('/')
async def hello(request):
return '''
<!DOCTYPE html>
<html>
<head>
<title>Microdot Sub-App Example</title>
<meta charset="UTF-8">
</head>
<body>
<div>
<h1>Microdot Main Page</h1>
<p>Visit the <a href="/subapp">sub-app</a>.</p>
</div>
</body>
</html>
''', 200, {'Content-Type': 'text/html'}
app.run(debug=True)


@@ -0,0 +1,44 @@
from microdot import Microdot
subapp = Microdot()
@subapp.route('')
async def hello(request):
# request.url_prefix can be used in links that are relative to this subapp
return f'''
<!DOCTYPE html>
<html>
<head>
<title>Microdot Sub-App Example</title>
<meta charset="UTF-8">
</head>
<body>
<div>
<h1>Microdot Sub-App Main Page</h1>
<p>Visit the sub-app's <a href="{request.url_prefix}/second">secondary page</a>.</p>
<p>Go back to the app's <a href="/">main page</a>.</p>
</div>
</body>
</html>
''', 200, {'Content-Type': 'text/html'} # noqa: E501
@subapp.route('/second')
async def second(request):
return f'''
<!DOCTYPE html>
<html>
<head>
<title>Microdot Sub-App Example</title>
<meta charset="UTF-8">
</head>
<body>
<div>
<h1>Microdot Sub-App Secondary Page</h1>
<p>Visit the sub-app's <a href="{request.url_prefix}">main page</a>.</p>
<p>Go back to the app's <a href="/">main page</a>.</p>
</div>
</body>
</html>
''', 200, {'Content-Type': 'text/html'} # noqa: E501


@@ -1,4 +1,5 @@
import ssl
import sys
from microdot import Microdot
app = Microdot()
@@ -31,6 +32,7 @@ async def shutdown(request):
return 'The server is shutting down...'
ext = 'der' if sys.implementation.name == 'micropython' else 'pem'
sslctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
sslctx.load_cert_chain('cert.pem', 'key.pem')
sslctx.load_cert_chain('cert.' + ext, 'key.' + ext)
app.run(port=4443, debug=True, ssl=sslctx)


@@ -1 +1,4 @@
This directory contains file upload examples.
- `simple_uploads.py` demonstrates how to upload a single file.
- `formdata.py` demonstrates how to process a form that includes file uploads.


@@ -0,0 +1,17 @@
<!doctype html>
<html>
<head>
<title>Microdot Multipart Form-Data Example</title>
<meta charset="UTF-8">
</head>
<body>
<h1>Microdot Multipart Form-Data Example</h1>
<form method="POST" action="" enctype="multipart/form-data">
<p>Name: <input type="text" name="name" /></p>
<p>Age: <input type="text" name="age" /></p>
<p>Comments: <textarea name="comments" rows="4"></textarea></p>
<p>File: <input type="file" id="file" name="file" /></p>
<input type="submit" value="Submit" />
</form>
</body>
</html>


@@ -0,0 +1,26 @@
from microdot import Microdot, send_file, Request
from microdot.multipart import with_form_data
app = Microdot()
Request.max_content_length = 1024 * 1024 # 1MB (change as needed)
@app.get('/')
async def index(request):
return send_file('formdata.html')
@app.post('/')
@with_form_data
async def upload(request):
print('Form fields:')
for field, value in request.form.items():
print(f'- {field}: {value}')
print('\nFile uploads:')
for field, value in request.files.items():
print(f'- {field}: {value.filename}, {await value.read()}')
return 'We have received your data!'
if __name__ == '__main__':
app.run(debug=True)


@@ -6,7 +6,7 @@ Request.max_content_length = 1024 * 1024 # 1MB (change as needed)
@app.get('/')
async def index(request):
return send_file('index.html')
return send_file('simple_uploads.html')
@app.post('/upload')


@@ -0,0 +1,139 @@
# SPDX-FileCopyrightText: 2017 Scott Shawcroft, written for Adafruit Industries
# SPDX-FileCopyrightText: Copyright (c) 2021 Jeff Epler for Adafruit Industries
#
# SPDX-License-Identifier: MIT
"""
`adafruit_ticks`
================================================================================
Work with intervals and deadlines in milliseconds
* Author(s): Jeff Epler
Implementation Notes
--------------------
**Software and Dependencies:**
* Adafruit CircuitPython firmware for the supported boards:
https://github.com/adafruit/circuitpython/releases
"""
# imports
from micropython import const
__version__ = "0.0.0+auto.0"
__repo__ = "https://github.com/adafruit/Adafruit_CircuitPython_ticks.git"
_TICKS_PERIOD = const(1 << 29)
_TICKS_MAX = const(_TICKS_PERIOD - 1)
_TICKS_HALFPERIOD = const(_TICKS_PERIOD // 2)
# Get the correct implementation of ticks_ms. There are three possibilities:
#
# - supervisor.ticks_ms is present. This will be the case starting in CP7.0
#
# - time.ticks_ms is present. This is the case for MicroPython & for the "unix
# port" of CircuitPython, used for some automated testing.
#
# - time.monotonic_ns is present, and works. This is the case on most
# Express boards in CP6.x, and most host computer versions of Python.
#
# - Otherwise, time.monotonic is assumed to be present. This is the case
# on most non-express boards in CP6.x, and some old host computer versions
# of Python.
#
# Note that on microcontrollers, this time source becomes increasingly
# inaccurate when the board has not been reset in a long time, losing the
# ability to measure 1ms intervals after about 1 hour, and losing the
# ability to measure 128ms intervals after 6 days. The only solution is to
# either upgrade to a version with supervisor.ticks_ms, or to switch to a
# board with time.monotonic_ns.
try:
from supervisor import ticks_ms # pylint: disable=unused-import
except (ImportError, NameError):
import time
if _ticks_ms := getattr(time, "ticks_ms", None):
def ticks_ms() -> int:
"""Return the time in milliseconds since an unspecified moment,
wrapping after 2**29ms.
The wrap value was chosen so that it is always possible to add or
subtract two `ticks_ms` values without overflow on a board without
long ints (or without allocating any long integer objects, on
boards with long ints).
This ticks value comes from a low-accuracy clock internal to the
microcontroller, just like `time.monotonic`. Due to its low
accuracy and the fact that it "wraps around" every few days, it is
intended for working with short term events like advancing an LED
animation, not for long term events like counting down the time
until a holiday."""
return _ticks_ms() & _TICKS_MAX # pylint: disable=not-callable
else:
try:
from time import monotonic_ns as _monotonic_ns
_monotonic_ns() # Check that monotonic_ns is usable
def ticks_ms() -> int:
"""Return the time in milliseconds since an unspecified moment,
wrapping after 2**29ms.
The wrap value was chosen so that it is always possible to add or
subtract two `ticks_ms` values without overflow on a board without
long ints (or without allocating any long integer objects, on
boards with long ints).
This ticks value comes from a low-accuracy clock internal to the
microcontroller, just like `time.monotonic`. Due to its low
accuracy and the fact that it "wraps around" every few days, it is
intended for working with short term events like advancing an LED
animation, not for long term events like counting down the time
until a holiday."""
return (_monotonic_ns() // 1_000_000) & _TICKS_MAX
except (ImportError, NameError, NotImplementedError):
from time import monotonic as _monotonic
def ticks_ms() -> int:
"""Return the time in milliseconds since an unspecified moment,
wrapping after 2**29ms.
The wrap value was chosen so that it is always possible to add or
subtract two `ticks_ms` values without overflow on a board without
long ints (or without allocating any long integer objects, on
boards with long ints).
This ticks value comes from a low-accuracy clock internal to the
microcontroller, just like `time.monotonic`. Due to its low
accuracy and the fact that it "wraps around" every few days, it is
intended for working with short term events like advancing an LED
animation, not for long term events like counting down the time
until a holiday."""
return int(_monotonic() * 1000) & _TICKS_MAX
def ticks_add(ticks: int, delta: int) -> int:
"Add a delta to a base number of ticks, performing wraparound at 2**29ms."
return (ticks + delta) % _TICKS_PERIOD
def ticks_diff(ticks1: int, ticks2: int) -> int:
"""Compute the signed difference between two ticks values,
assuming that they are within 2**28 ticks"""
diff = (ticks1 - ticks2) & _TICKS_MAX
diff = ((diff + _TICKS_HALFPERIOD) & _TICKS_MAX) - _TICKS_HALFPERIOD
return diff
def ticks_less(ticks1: int, ticks2: int) -> bool:
"""Return true if ticks1 is before ticks2 and false otherwise,
assuming that they are within 2**28 ticks"""
return ticks_diff(ticks1, ticks2) < 0


@@ -0,0 +1,41 @@
# SPDX-FileCopyrightText: 2019 Damien P. George
#
# SPDX-License-Identifier: MIT
#
# MicroPython uasyncio module
# MIT license; Copyright (c) 2019 Damien P. George
#
# This code comes from MicroPython, and has not been run through black or pylint there.
# Altering these files significantly would make merging difficult, so we will not use
# pylint or black.
# pylint: skip-file
# fmt: off
from .core import *
__version__ = "0.0.0+auto.0"
__repo__ = "https://github.com/Adafruit/Adafruit_CircuitPython_asyncio.git"
_attrs = {
"wait_for": "funcs",
"wait_for_ms": "funcs",
"gather": "funcs",
"Event": "event",
"ThreadSafeFlag": "event",
"Lock": "lock",
"open_connection": "stream",
"start_server": "stream",
"StreamReader": "stream",
"StreamWriter": "stream",
}
# Lazy loader, effectively does:
# global attr
# from .mod import attr
def __getattr__(attr):
mod = _attrs.get(attr, None)
if mod is None:
raise AttributeError(attr)
value = getattr(__import__(mod, None, None, True, 1), attr)
globals()[attr] = value
return value


@@ -0,0 +1,430 @@
# SPDX-FileCopyrightText: 2019 Damien P. George
#
# SPDX-License-Identifier: MIT
#
# MicroPython uasyncio module
# MIT license; Copyright (c) 2019 Damien P. George
#
# This code comes from MicroPython, and has not been run through black or pylint there.
# Altering these files significantly would make merging difficult, so we will not use
# pylint or black.
# pylint: skip-file
# fmt: off
"""
Core
====
"""
from adafruit_ticks import ticks_ms as ticks, ticks_diff, ticks_add
import sys, select
try:
from traceback import print_exception
except:
from .traceback import print_exception
# Import TaskQueue and Task, preferring built-in C code over Python code
try:
from _asyncio import TaskQueue, Task
except ImportError:
from .task import TaskQueue, Task
################################################################################
# Exceptions
# Depending on the release of CircuitPython these errors may or may not
# exist in the C implementation of `_asyncio`. However, when they
# do exist, they must be preferred over the Python code.
try:
from _asyncio import CancelledError, InvalidStateError
except (ImportError, AttributeError):
class CancelledError(BaseException):
"""Injected into a task when calling `Task.cancel()`"""
pass
class InvalidStateError(Exception):
"""Can be raised in situations like setting a result value for a task object that already has a result value set."""
pass
class TimeoutError(Exception):
"""Raised when waiting for a task longer than the specified timeout."""
pass
# Used when calling Loop.call_exception_handler
_exc_context = {"message": "Task exception wasn't retrieved", "exception": None, "future": None}
################################################################################
# Sleep functions
# "Yield" once, then raise StopIteration
class SingletonGenerator:
def __init__(self):
self.state = None
self.exc = StopIteration()
def __iter__(self):
return self
def __await__(self):
return self
def __next__(self):
if self.state is not None:
_task_queue.push_sorted(cur_task, self.state)
self.state = None
return None
else:
self.exc.__traceback__ = None
raise self.exc
# Pause task execution for the given time (integer in milliseconds, uPy extension)
# Use a SingletonGenerator to do it without allocating on the heap
def sleep_ms(t, sgen=SingletonGenerator()):
"""Sleep for *t* milliseconds.
This is a coroutine, and a MicroPython extension.
"""
assert sgen.state is None, "Check for a missing `await` in your code"
sgen.state = ticks_add(ticks(), max(0, t))
return sgen
# Pause task execution for the given time (in seconds)
def sleep(t):
"""Sleep for *t* seconds
This is a coroutine.
"""
return sleep_ms(int(t * 1000))
################################################################################
# "Never schedule" object"
# Don't re-schedule the object that awaits _never().
# For internal use only. Some constructs, like `await event.wait()`,
# work by NOT re-scheduling the task which calls wait(), but by
# having some other task schedule it later.
class _NeverSingletonGenerator:
def __init__(self):
self.state = None
self.exc = StopIteration()
def __iter__(self):
return self
def __await__(self):
return self
def __next__(self):
if self.state is not None:
self.state = None
return None
else:
self.exc.__traceback__ = None
raise self.exc
def _never(sgen=_NeverSingletonGenerator()):
# assert sgen.state is None, "Check for a missing `await` in your code"
sgen.state = False
return sgen
################################################################################
# Queue and poller for stream IO
class IOQueue:
def __init__(self):
self.poller = select.poll()
self.map = {} # maps id(stream) to [task_waiting_read, task_waiting_write, stream]
def _enqueue(self, s, idx):
if id(s) not in self.map:
entry = [None, None, s]
entry[idx] = cur_task
self.map[id(s)] = entry
self.poller.register(s, select.POLLIN if idx == 0 else select.POLLOUT)
else:
sm = self.map[id(s)]
assert sm[idx] is None
assert sm[1 - idx] is not None
sm[idx] = cur_task
self.poller.modify(s, select.POLLIN | select.POLLOUT)
# Link task to this IOQueue so it can be removed if needed
cur_task.data = self
def _dequeue(self, s):
del self.map[id(s)]
self.poller.unregister(s)
async def queue_read(self, s):
self._enqueue(s, 0)
await _never()
async def queue_write(self, s):
self._enqueue(s, 1)
await _never()
def remove(self, task):
while True:
del_s = None
for k in self.map: # Iterate without allocating on the heap
q0, q1, s = self.map[k]
if q0 is task or q1 is task:
del_s = s
break
if del_s is not None:
self._dequeue(s)
else:
break
def wait_io_event(self, dt):
for s, ev in self.poller.ipoll(dt):
sm = self.map[id(s)]
# print('poll', s, sm, ev)
if ev & ~select.POLLOUT and sm[0] is not None:
# POLLIN or error
_task_queue.push_head(sm[0])
sm[0] = None
if ev & ~select.POLLIN and sm[1] is not None:
# POLLOUT or error
_task_queue.push_head(sm[1])
sm[1] = None
if sm[0] is None and sm[1] is None:
self._dequeue(s)
elif sm[0] is None:
self.poller.modify(s, select.POLLOUT)
else:
self.poller.modify(s, select.POLLIN)
################################################################################
# Main run loop
# Ensure the awaitable is a task
def _promote_to_task(aw):
return aw if isinstance(aw, Task) else create_task(aw)
# Create and schedule a new task from a coroutine
def create_task(coro):
"""Create a new task from the given coroutine and schedule it to run.
Returns the corresponding `Task` object.
"""
if not hasattr(coro, "send"):
raise TypeError("coroutine expected")
t = Task(coro, globals())
_task_queue.push_head(t)
return t
# Keep scheduling tasks until there are none left to schedule
def run_until_complete(main_task=None):
"""Run the given *main_task* until it completes."""
global cur_task
excs_all = (CancelledError, Exception) # To prevent heap allocation in loop
excs_stop = (CancelledError, StopIteration) # To prevent heap allocation in loop
while True:
# Wait until the head of _task_queue is ready to run
dt = 1
while dt > 0:
dt = -1
t = _task_queue.peek()
if t:
# A task waiting on _task_queue; "ph_key" is time to schedule task at
dt = max(0, ticks_diff(t.ph_key, ticks()))
elif not _io_queue.map:
# No tasks can be woken so finished running
return
# print('(poll {})'.format(dt), len(_io_queue.map))
_io_queue.wait_io_event(dt)
# Get next task to run and continue it
t = _task_queue.pop_head()
cur_task = t
try:
# Continue running the coroutine, it's responsible for rescheduling itself
exc = t.data
if not exc:
t.coro.send(None)
else:
# If the task is finished and on the run queue and gets here, then it
# had an exception and was not await'ed on. Throwing into it now will
# raise StopIteration and the code below will catch this and run the
# call_exception_handler function.
t.data = None
t.coro.throw(exc)
except excs_all as er:
# Check the task is not on any event queue
assert t.data is None
# This task is done, check if it's the main task and then loop should stop
if t is main_task:
if isinstance(er, StopIteration):
return er.value
raise er
if t.state:
# Task was running but is now finished.
waiting = False
if t.state is True:
# "None" indicates that the task is complete and not await'ed on (yet).
t.state = None
elif callable(t.state):
# The task has a callback registered to be called on completion.
t.state(t, er)
t.state = False
waiting = True
else:
# Schedule any other tasks waiting on the completion of this task.
while t.state.peek():
_task_queue.push_head(t.state.pop_head())
waiting = True
# "False" indicates that the task is complete and has been await'ed on.
t.state = False
if not waiting and not isinstance(er, excs_stop):
# An exception ended this detached task, so queue it for later
# execution to handle the uncaught exception if no other task retrieves
# the exception in the meantime (this is handled by Task.throw).
_task_queue.push_head(t)
# Save return value of coro to pass up to caller.
t.data = er
elif t.state is None:
# Task is already finished and nothing await'ed on the task,
# so call the exception handler.
_exc_context["exception"] = exc
_exc_context["future"] = t
Loop.call_exception_handler(_exc_context)
# Create a new task from a coroutine and run it until it finishes
def run(coro):
"""Create a new task from the given coroutine and run it until it completes.
Returns the value returned by *coro*.
"""
return run_until_complete(create_task(coro))
################################################################################
# Event loop wrapper
async def _stopper():
pass
_stop_task = None
class Loop:
"""Class representing the event loop"""
_exc_handler = None
def create_task(coro):
"""Create a task from the given *coro* and return the new `Task` object."""
return create_task(coro)
def run_forever():
"""Run the event loop until `Loop.stop()` is called."""
global _stop_task
_stop_task = Task(_stopper(), globals())
run_until_complete(_stop_task)
# TODO should keep running until .stop() is called, even if there're no tasks left
def run_until_complete(aw):
"""Run the given *awaitable* until it completes. If *awaitable* is not a task then
it will be promoted to one.
"""
return run_until_complete(_promote_to_task(aw))
def stop():
"""Stop the event loop"""
global _stop_task
if _stop_task is not None:
_task_queue.push_head(_stop_task)
# If stop() is called again, do nothing
_stop_task = None
def close():
"""Close the event loop."""
pass
def set_exception_handler(handler):
"""Set the exception handler to call when a Task raises an exception that is not
caught. The *handler* should accept two arguments: ``(loop, context)``
"""
Loop._exc_handler = handler
def get_exception_handler():
"""Get the current exception handler. Returns the handler, or ``None`` if no
custom handler is set.
"""
return Loop._exc_handler
def default_exception_handler(loop, context):
"""The default exception handler that is called."""
exc = context["exception"]
print_exception(None, exc, exc.__traceback__)
def call_exception_handler(context):
"""Call the current exception handler. The argument *context* is passed through
and is a dictionary containing keys:
``'message'``, ``'exception'``, ``'future'``
"""
(Loop._exc_handler or Loop.default_exception_handler)(Loop, context)
# The runq_len and waitq_len arguments are for legacy uasyncio compatibility
def get_event_loop(runq_len=0, waitq_len=0):
"""Return the event loop used to schedule and run tasks. See `Loop`."""
return Loop
def current_task():
"""Return the `Task` object associated with the currently running task."""
return cur_task
def new_event_loop():
"""Reset the event loop and return it.
**NOTE**: Since MicroPython only has a single event loop, this function just resets
the loop's state, it does not create a new one
"""
global _task_queue, _io_queue, _exc_context, cur_task
# TaskQueue of Task instances
_task_queue = TaskQueue()
# Task queue and poller for stream IO
_io_queue = IOQueue()
cur_task = None
_exc_context['exception'] = None
_exc_context['future'] = None
return Loop
# Initialise default event loop
new_event_loop()


@@ -0,0 +1,92 @@
# SPDX-FileCopyrightText: 2019-2020 Damien P. George
#
# SPDX-License-Identifier: MIT
#
# MicroPython uasyncio module
# MIT license; Copyright (c) 2019-2020 Damien P. George
#
# This code comes from MicroPython, and has not been run through black or pylint there.
# Altering these files significantly would make merging difficult, so we will not use
# pylint or black.
# pylint: skip-file
# fmt: off
"""
Events
======
"""
from . import core
# Event class for primitive events that can be waited on, set, and cleared
class Event:
"""Create a new event which can be used to synchronize tasks. Events
start in the cleared state.
"""
def __init__(self):
self.state = False # False=unset; True=set
self.waiting = core.TaskQueue() # Queue of Tasks waiting on completion of this event
def is_set(self):
"""Returns ``True`` if the event is set, ``False`` otherwise."""
return self.state
def set(self):
"""Set the event. Any tasks waiting on the event will be scheduled to run.
"""
# Event becomes set, schedule any tasks waiting on it
# Note: This must not be called from anything except the thread running
# the asyncio loop (i.e. neither hard or soft IRQ, or a different thread).
while self.waiting.peek():
core._task_queue.push_head(self.waiting.pop_head())
self.state = True
def clear(self):
"""Clear the event."""
self.state = False
async def wait(self):
"""Wait for the event to be set. If the event is already set then it returns
immediately.
This is a coroutine.
"""
if not self.state:
# Event not set, put the calling task on the event's waiting queue
self.waiting.push_head(core.cur_task)
# Set calling task's data to the event's queue so it can be removed if needed
core.cur_task.data = self.waiting
await core._never()
return True
# MicroPython-extension: This can be set from outside the asyncio event loop,
# such as other threads, IRQs or scheduler context. Implementation is a stream
# that asyncio will poll until a flag is set.
# Note: Unlike Event, this is self-clearing.
try:
import uio
class ThreadSafeFlag(uio.IOBase):
def __init__(self):
self._flag = 0
def ioctl(self, req, flags):
if req == 3: # MP_STREAM_POLL
return self._flag * flags
return None
def set(self):
self._flag = 1
async def wait(self):
if not self._flag:
yield core._io_queue.queue_read(self)
self._flag = 0
except ImportError:
pass


@@ -0,0 +1,165 @@
# SPDX-FileCopyrightText: 2019-2020 Damien P. George
#
# SPDX-License-Identifier: MIT
#
# MicroPython uasyncio module
# MIT license; Copyright (c) 2019-2022 Damien P. George
#
# This code comes from MicroPython, and has not been run through black or pylint there.
# Altering these files significantly would make merging difficult, so we will not use
# pylint or black.
# pylint: skip-file
# fmt: off
"""
Functions
=========
"""
from . import core
async def _run(waiter, aw):
try:
result = await aw
status = True
except BaseException as er:
result = None
status = er
if waiter.data is None:
# The waiter is still waiting, cancel it.
if waiter.cancel():
# Waiter was cancelled by us, change its CancelledError to an instance of
# CancelledError that contains the status and result of waiting on aw.
# If the wait_for task subsequently gets cancelled externally then this
# instance will be reset to a CancelledError instance without arguments.
waiter.data = core.CancelledError(status, result)
async def wait_for(aw, timeout, sleep=core.sleep):
"""Wait for the *aw* awaitable to complete, but cancel if it takes longer
than *timeout* seconds. If *aw* is not a task then a task will be created
from it.
If a timeout occurs, it cancels the task and raises ``asyncio.TimeoutError``:
this should be trapped by the caller.
Returns the return value of *aw*.
This is a coroutine.
"""
aw = core._promote_to_task(aw)
if timeout is None:
return await aw
# Run aw in a separate runner task that manages its exceptions.
runner_task = core.create_task(_run(core.cur_task, aw))
try:
# Wait for the timeout to elapse.
await sleep(timeout)
except core.CancelledError as er:
status = er.args[0] if er.args else None
if status is None:
# This wait_for was cancelled externally, so cancel aw and re-raise.
runner_task.cancel()
raise er
elif status is True:
# aw completed successfully and cancelled the sleep, so return aw's result.
return er.args[1]
else:
# aw raised an exception, propagate it out to the caller.
raise status
# The sleep finished before aw, so cancel aw and raise TimeoutError.
runner_task.cancel()
await runner_task
raise core.TimeoutError
def wait_for_ms(aw, timeout):
"""Similar to `wait_for` but *timeout* is an integer in milliseconds.
This is a coroutine, and a MicroPython extension.
"""
return wait_for(aw, timeout, core.sleep_ms)
class _Remove:
@staticmethod
def remove(t):
pass
async def gather(*aws, return_exceptions=False):
"""Run all *aws* awaitables concurrently. Any *aws* that are not tasks
are promoted to tasks.
Returns a list of return values of all *aws*
"""
if not aws:
return []
def done(t, er):
# Sub-task "t" has finished, with exception "er".
nonlocal state
if gather_task.data is not _Remove:
# The main gather task has already been scheduled, so do nothing.
# This happens if another sub-task already raised an exception and
# woke the main gather task (via this done function), or if the main
# gather task was cancelled externally.
return
elif not return_exceptions and not isinstance(er, StopIteration):
# A sub-task raised an exception, indicate that to the gather task.
state = er
else:
state -= 1
if state:
# Still some sub-tasks running.
return
# Gather waiting is done, schedule the main gather task.
core._task_queue.push_head(gather_task)
ts = [core._promote_to_task(aw) for aw in aws]
for i in range(len(ts)):
if ts[i].state is not True:
# Task is not running, gather not currently supported for this case.
raise RuntimeError("can't gather")
# Register the callback to call when the task is done.
ts[i].state = done
# Set the state for execution of the gather.
gather_task = core.cur_task
state = len(ts)
cancel_all = False
# Wait for a sub-task to need attention.
gather_task.data = _Remove
try:
await core._never()
except core.CancelledError as er:
cancel_all = True
state = er
# Clean up tasks.
for i in range(len(ts)):
if ts[i].state is done:
# Sub-task is still running, deregister the callback and cancel if needed.
ts[i].state = True
if cancel_all:
ts[i].cancel()
elif isinstance(ts[i].data, StopIteration):
# Sub-task ran to completion, get its return value.
ts[i] = ts[i].data.value
else:
# Sub-task had an exception with return_exceptions==True, so get its exception.
ts[i] = ts[i].data
# Either this gather was cancelled, or one of the sub-tasks raised an exception with
# return_exceptions==False, so reraise the exception here.
if state is not 0:
raise state
# Return the list of return values of each sub-task.
return ts


@@ -0,0 +1,87 @@
# SPDX-FileCopyrightText: 2019-2020 Damien P. George
#
# SPDX-License-Identifier: MIT
#
# MicroPython uasyncio module
# MIT license; Copyright (c) 2019-2020 Damien P. George
#
# This code comes from MicroPython, and has not been run through black or pylint there.
# Altering these files significantly would make merging difficult, so we will not use
# pylint or black.
# pylint: skip-file
# fmt: off
"""
Locks
=====
"""
from . import core
# Lock class for primitive mutex capability
class Lock:
"""Create a new lock which can be used to coordinate tasks. Locks start in
the unlocked state.
In addition to the methods below, locks can be used in an ``async with``
statement.
"""
def __init__(self):
# The state can take the following values:
# - 0: unlocked
# - 1: locked
# - <Task>: unlocked but this task has been scheduled to acquire the lock next
self.state = 0
# Queue of Tasks waiting to acquire this Lock
self.waiting = core.TaskQueue()
def locked(self):
"""Returns ``True`` if the lock is locked, otherwise ``False``."""
return self.state == 1
def release(self):
"""Release the lock. If any tasks are waiting on the lock then the next
one in the queue is scheduled to run and the lock remains locked. Otherwise,
no tasks are waiting and the lock becomes unlocked.
"""
if self.state != 1:
raise RuntimeError("Lock not acquired")
if self.waiting.peek():
# Task(s) waiting on lock, schedule next Task
self.state = self.waiting.pop_head()
core._task_queue.push_head(self.state)
else:
# No Task waiting so unlock
self.state = 0
async def acquire(self):
"""Wait for the lock to be in the unlocked state and then lock it in an
atomic way. Only one task can acquire the lock at any one time.
This is a coroutine.
"""
if self.state != 0:
# Lock unavailable, put the calling Task on the waiting queue
self.waiting.push_head(core.cur_task)
# Set calling task's data to the lock's queue so it can be removed if needed
core.cur_task.data = self.waiting
try:
await core._never()
except core.CancelledError as er:
if self.state == core.cur_task:
# Cancelled while pending on resume, schedule next waiting Task
self.state = 1
self.release()
raise er
# Lock available, set it as locked
self.state = 1
return True
async def __aenter__(self):
return await self.acquire()
async def __aexit__(self, exc_type, exc, tb):
return self.release()


@@ -0,0 +1,24 @@
# SPDX-FileCopyrightText: 2019 Damien P. George
#
# SPDX-License-Identifier: MIT
#
#
# This code comes from MicroPython, and has not been run through black or pylint there.
# Altering these files significantly would make merging difficult, so we will not use
# pylint or black.
# pylint: skip-file
# fmt: off
# This list of frozen files doesn't include task.py because that's provided by the C module.
freeze(
"..",
(
"uasyncio/__init__.py",
"uasyncio/core.py",
"uasyncio/event.py",
"uasyncio/funcs.py",
"uasyncio/lock.py",
"uasyncio/stream.py",
),
opt=3,
)


@@ -0,0 +1,263 @@
# SPDX-FileCopyrightText: 2019-2020 Damien P. George
#
# SPDX-License-Identifier: MIT
#
# MicroPython uasyncio module
# MIT license; Copyright (c) 2019-2020 Damien P. George
#
# This code comes from MicroPython, and has not been run through black or pylint there.
# Altering these files significantly would make merging difficult, so we will not use
# pylint or black.
# pylint: skip-file
# fmt: off
"""
Streams
=======
"""
from . import core
class Stream:
"""This represents a TCP stream connection. To minimise code this class
implements both a reader and a writer, and both ``StreamReader`` and
``StreamWriter`` alias to this class.
"""
def __init__(self, s, e={}):
self.s = s
self.e = e
self.out_buf = b""
def get_extra_info(self, v):
"""Get extra information about the stream, given by *v*. The valid
values for *v* are: ``peername``.
"""
return self.e[v]
async def __aenter__(self):
return self
async def __aexit__(self, exc_type, exc, tb):
await self.close()
def close(self):
pass
async def wait_closed(self):
"""Wait for the stream to close.
This is a coroutine.
"""
# TODO yield?
self.s.close()
async def read(self, n):
"""Read up to *n* bytes and return them.
This is a coroutine.
"""
await core._io_queue.queue_read(self.s)
return self.s.read(n)
async def readinto(self, buf):
"""Read up to n bytes into *buf* with n being equal to the length of *buf*
Return the number of bytes read into *buf*
This is a coroutine, and a MicroPython extension.
"""
await core._io_queue.queue_read(self.s)
return self.s.readinto(buf)
async def readexactly(self, n):
"""Read exactly *n* bytes and return them as a bytes object.
Raises an ``EOFError`` exception if the stream ends before reading
*n* bytes.
This is a coroutine.
"""
r = b""
while n:
await core._io_queue.queue_read(self.s)
r2 = self.s.read(n)
if r2 is not None:
if not len(r2):
raise EOFError
r += r2
n -= len(r2)
return r
async def readline(self):
"""Read a line and return it.
This is a coroutine.
"""
l = b""
while True:
await core._io_queue.queue_read(self.s)
l2 = self.s.readline() # may do multiple reads but won't block
l += l2
if not l2 or l[-1] == 10: # \n (check l in case l2 is str)
return l
def write(self, buf):
"""Accumulated *buf* to the output buffer. The data is only flushed when
`Stream.drain` is called. It is recommended to call `Stream.drain`
immediately after calling this function.
"""
if not self.out_buf:
# Try to write immediately to the underlying stream.
ret = self.s.write(buf)
if ret == len(buf):
return
if ret is not None:
buf = buf[ret:]
self.out_buf += buf
async def drain(self):
"""Drain (write) all buffered output data out to the stream.
This is a coroutine.
"""
mv = memoryview(self.out_buf)
off = 0
while off < len(mv):
await core._io_queue.queue_write(self.s)
ret = self.s.write(mv[off:])
if ret is not None:
off += ret
self.out_buf = b""
# Stream can be used for both reading and writing to save code size
StreamReader = Stream
StreamWriter = Stream
# Create a TCP stream connection to a remote host
async def open_connection(host, port):
"""Open a TCP connection to the given *host* and *port*. The *host* address will
be resolved using `socket.getaddrinfo`, which is currently a blocking call.
Returns a pair of streams: a reader and a writer stream. Will raise a socket-specific
``OSError`` if the host could not be resolved or if the connection could not be made.
This is a coroutine.
"""
from uerrno import EINPROGRESS
import usocket as socket
ai = socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM)[0] # TODO this is blocking!
s = socket.socket(ai[0], ai[1], ai[2])
s.setblocking(False)
ss = Stream(s)
try:
s.connect(ai[-1])
except OSError as er:
if er.errno != EINPROGRESS:
raise er
await core._io_queue.queue_write(s)
return ss, ss
# Class representing a TCP stream server, can be closed and used in "async with"
class Server:
"""This represents the server class returned from `start_server`. It can be used in
an ``async with`` statement to close the server upon exit.
"""
async def __aenter__(self):
return self
async def __aexit__(self, exc_type, exc, tb):
self.close()
await self.wait_closed()
def close(self):
"""Close the server."""
self.task.cancel()
async def wait_closed(self):
"""Wait for the server to close.
This is a coroutine.
"""
await self.task
async def _serve(self, s, cb):
# Accept incoming connections
while True:
try:
await core._io_queue.queue_read(s)
except core.CancelledError:
# Shutdown server
s.close()
return
try:
s2, addr = s.accept()
except:
# Ignore a failed accept
continue
s2.setblocking(False)
s2s = Stream(s2, {"peername": addr})
core.create_task(cb(s2s, s2s))
# Helper function to start a TCP stream server, running as a new task
# TODO could use an accept-callback on socket read activity instead of creating a task
async def start_server(cb, host, port, backlog=5):
"""Start a TCP server on the given *host* and *port*. The *cb* callback will be
called with incoming, accepted connections, and be passed 2 arguments: the
reader and writer streams for the connection.
Returns a `Server` object.
This is a coroutine.
"""
import usocket as socket
# Create and bind server socket.
host = socket.getaddrinfo(host, port)[0] # TODO this is blocking!
s = socket.socket()
s.setblocking(False)
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.bind(host[-1])
s.listen(backlog)
# Create and return server object and task.
srv = Server()
srv.task = core.create_task(srv._serve(s, cb))
return srv
################################################################################
# Legacy uasyncio compatibility
async def stream_awrite(self, buf, off=0, sz=-1):
if off != 0 or sz != -1:
buf = memoryview(buf)
if sz == -1:
sz = len(buf)
buf = buf[off : off + sz]
self.write(buf)
await self.drain()
Stream.aclose = Stream.wait_closed
Stream.awrite = stream_awrite
Stream.awritestr = stream_awrite # TODO explicitly convert to bytes?
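For orientation, here is a minimal usage sketch of the stream API defined above (an echo handler served with ``start_server`` and exercised with ``open_connection``). It assumes this package is importable as ``asyncio`` on a MicroPython-style runtime; host, port and handler names are illustrative only::

    import asyncio

    async def echo(reader, writer):
        # read one line from the client and echo it back
        line = await reader.readline()
        writer.write(line)
        await writer.drain()
        writer.close()
        await writer.wait_closed()

    async def main():
        server = await asyncio.start_server(echo, '0.0.0.0', 8080)
        reader, writer = await asyncio.open_connection('127.0.0.1', 8080)
        writer.write(b'hello\n')
        await writer.drain()
        print(await reader.readline())   # b'hello\n'
        server.close()
        await server.wait_closed()

    asyncio.run(main())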

View File

@@ -0,0 +1,215 @@
# SPDX-FileCopyrightText: 2019-2020 Damien P. George
#
# SPDX-License-Identifier: MIT
#
# MicroPython uasyncio module
# MIT license; Copyright (c) 2019-2020 Damien P. George
#
# This code comes from MicroPython, and has not been run through black or pylint there.
# Altering these files significantly would make merging difficult, so we will not use
# pylint or black.
# pylint: skip-file
# fmt: off
"""
Tasks
=====
"""
# This file contains the core TaskQueue based on a pairing heap, and the core Task class.
# They can optionally be replaced by C implementations.
from . import core
# pairing-heap meld of 2 heaps; O(1)
def ph_meld(h1, h2):
if h1 is None:
return h2
if h2 is None:
return h1
lt = core.ticks_diff(h1.ph_key, h2.ph_key) < 0
if lt:
if h1.ph_child is None:
h1.ph_child = h2
else:
h1.ph_child_last.ph_next = h2
h1.ph_child_last = h2
h2.ph_next = None
h2.ph_rightmost_parent = h1
return h1
else:
h1.ph_next = h2.ph_child
h2.ph_child = h1
if h1.ph_next is None:
h2.ph_child_last = h1
h1.ph_rightmost_parent = h2
return h2
# pairing-heap pairing operation; amortised O(log N)
def ph_pairing(child):
heap = None
while child is not None:
n1 = child
child = child.ph_next
n1.ph_next = None
if child is not None:
n2 = child
child = child.ph_next
n2.ph_next = None
n1 = ph_meld(n1, n2)
heap = ph_meld(heap, n1)
return heap
# pairing-heap delete of a node; stable, amortised O(log N)
def ph_delete(heap, node):
if node is heap:
child = heap.ph_child
node.ph_child = None
return ph_pairing(child)
# Find parent of node
parent = node
while parent.ph_next is not None:
parent = parent.ph_next
parent = parent.ph_rightmost_parent
# Replace node with pairing of its children
if node is parent.ph_child and node.ph_child is None:
parent.ph_child = node.ph_next
node.ph_next = None
return heap
elif node is parent.ph_child:
child = node.ph_child
next = node.ph_next
node.ph_child = None
node.ph_next = None
node = ph_pairing(child)
parent.ph_child = node
else:
n = parent.ph_child
while node is not n.ph_next:
n = n.ph_next
child = node.ph_child
next = node.ph_next
node.ph_child = None
node.ph_next = None
node = ph_pairing(child)
if node is None:
node = n
else:
n.ph_next = node
node.ph_next = next
if next is None:
node.ph_rightmost_parent = parent
parent.ph_child_last = node
return heap
# TaskQueue class based on the above pairing-heap functions.
class TaskQueue:
def __init__(self):
self.heap = None
def peek(self):
return self.heap
def push(self, v, key=None):
assert v.ph_child is None
assert v.ph_next is None
v.data = None
v.ph_key = key if key is not None else core.ticks()
self.heap = ph_meld(v, self.heap)
def pop(self):
v = self.heap
assert v.ph_next is None
self.heap = ph_pairing(v.ph_child)
v.ph_child = None
return v
def remove(self, v):
self.heap = ph_delete(self.heap, v)
# Compatibility aliases, remove after they are no longer used
push_head = push
push_sorted = push
pop_head = pop
# Task class representing a coroutine, can be waited on and cancelled.
class Task:
"""This object wraps a coroutine into a running task. Tasks can be waited on
using ``await task``, which will wait for the task to complete and return the
return value of the task.
Tasks should not be created directly, rather use ``create_task`` to create them.
"""
def __init__(self, coro, globals=None):
self.coro = coro # Coroutine of this Task
self.data = None # General data for queue it is waiting on
self.state = True # None, False, True, a callable, or a TaskQueue instance
self.ph_key = 0 # Pairing heap
self.ph_child = None # Pairing heap
self.ph_child_last = None # Pairing heap
self.ph_next = None # Pairing heap
self.ph_rightmost_parent = None # Pairing heap
def __iter__(self):
if not self.state:
# Task finished, signal that it has been await'ed on.
self.state = False
elif self.state is True:
# Allocated head of linked list of Tasks waiting on completion of this task.
self.state = TaskQueue()
elif type(self.state) is not TaskQueue:
# Task has state used for another purpose, so can't also wait on it.
raise RuntimeError("can't wait")
return self
# CircuitPython needs __await__().
__await__ = __iter__
def __next__(self):
if not self.state:
if self.data is None:
# Task finished but has already been sent to the loop's exception handler.
raise StopIteration
else:
# Task finished, raise return value to caller so it can continue.
raise self.data
else:
# Put calling task on waiting queue.
self.state.push(core.cur_task)
# Set calling task's data to this task that it waits on, to double-link it.
core.cur_task.data = self
def done(self):
"""Whether the task is complete."""
return not self.state
def cancel(self):
"""Cancel the task by injecting a ``CancelledError`` into it. The task
may or may not ignore this exception.
"""
# Check if task is already finished.
if not self.state:
return False
# Can't cancel self (not supported yet).
if self is core.cur_task:
raise RuntimeError("can't cancel self")
# If Task waits on another task then forward the cancel to the one it's waiting on.
while isinstance(self.data, Task):
self = self.data
# Reschedule Task as a cancelled task.
if hasattr(self.data, "remove"):
# Not on the main running queue, remove the task from the queue it's on.
self.data.remove(self)
core._task_queue.push(self)
elif core.ticks_diff(self.ph_key, core.ticks()) > 0:
# On the main running queue but scheduled in the future, so bring it forward to now.
core._task_queue.remove(self)
core._task_queue.push(self)
self.data = core.CancelledError
return True
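As a usage sketch of the ``Task`` semantics documented above (awaiting a task's return value and cancelling a pending one), assuming the package is importable as ``asyncio`` with the usual ``sleep``/``create_task`` helpers::

    import asyncio

    async def worker(delay, result):
        await asyncio.sleep(delay)
        return result

    async def main():
        quick = asyncio.create_task(worker(0.1, 42))
        slow = asyncio.create_task(worker(60, None))
        print(await quick)       # waits for completion and prints 42
        slow.cancel()            # injects CancelledError into the sleeping task
        await asyncio.sleep(0)   # give the loop a chance to deliver the cancellation
        print(slow.done())       # typically True once the task has unwound

    asyncio.run(main())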

View File

@@ -0,0 +1,57 @@
# SPDX-FileCopyrightText: 2019-2020 Damien P. George
#
# SPDX-License-Identifier: MIT
#
# MicroPython uasyncio module
# MIT license; Copyright (c) 2019-2020 Damien P. George
"""
Fallback traceback module if the system traceback is missing.
"""
try:
from typing import List
except ImportError:
pass
import sys
def _print_traceback(traceback, limit=None, file=sys.stderr) -> List[str]:
if limit is None:
if hasattr(sys, "tracebacklimit"):
limit = sys.tracebacklimit
n = 0
while traceback is not None:
frame = traceback.tb_frame
line_number = traceback.tb_lineno
frame_code = frame.f_code
filename = frame_code.co_filename
name = frame_code.co_name
print(' File "%s", line %d, in %s' % (filename, line_number, name), file=file)
traceback = traceback.tb_next
n = n + 1
if limit is not None and n >= limit:
break
def print_exception(exception, value=None, traceback=None, limit=None, file=sys.stderr):
"""
Print exception information and stack trace to file.
"""
if traceback:
print("Traceback (most recent call last):", file=file)
_print_traceback(traceback, limit=limit, file=file)
if isinstance(exception, BaseException):
exception_type = type(exception).__name__
elif hasattr(exception, "__name__"):
exception_type = exception.__name__
else:
exception_type = type(value).__name__
valuestr = str(value)
if value is None or not valuestr:
print(exception_type, file=file)
else:
print("%s: %s" % (str(exception_type), valuestr), file=file)

View File

@@ -13,12 +13,6 @@ class Stream:
def get_extra_info(self, v):
return self.e[v]
async def __aenter__(self):
return self
async def __aexit__(self, exc_type, exc, tb):
await self.close()
def close(self):
pass
@@ -63,6 +57,8 @@ class Stream:
while True:
yield core._io_queue.queue_read(self.s)
l2 = self.s.readline() # may do multiple reads but won't block
if l2 is None:
continue
l += l2
if not l2 or l[-1] == 10: # \n (check l in case l2 is str)
return l
@@ -100,19 +96,29 @@ StreamWriter = Stream
# Create a TCP stream connection to a remote host
#
# async
def open_connection(host, port):
def open_connection(host, port, ssl=None, server_hostname=None):
from errno import EINPROGRESS
import socket
ai = socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM)[0] # TODO this is blocking!
s = socket.socket(ai[0], ai[1], ai[2])
s.setblocking(False)
ss = Stream(s)
try:
s.connect(ai[-1])
except OSError as er:
if er.errno != EINPROGRESS:
raise er
# wrap with SSL, if requested
if ssl:
if ssl is True:
import ssl as _ssl
ssl = _ssl.SSLContext(_ssl.PROTOCOL_TLS_CLIENT)
if not server_hostname:
server_hostname = host
s = ssl.wrap_socket(s, server_hostname=server_hostname, do_handshake_on_connect=False)
s.setblocking(False)
ss = Stream(s)
yield core._io_queue.queue_write(s)
return ss, ss
@@ -135,7 +141,7 @@ class Server:
async def wait_closed(self):
await self.task
async def _serve(self, s, cb):
async def _serve(self, s, cb, ssl):
self.state = False
# Accept incoming connections
while True:
@@ -156,6 +162,13 @@ class Server:
except:
# Ignore a failed accept
continue
if ssl:
try:
s2 = ssl.wrap_socket(s2, server_side=True, do_handshake_on_connect=False)
except OSError as e:
core.sys.print_exception(e)
s2.close()
continue
s2.setblocking(False)
s2s = Stream(s2, {"peername": addr})
core.create_task(cb(s2s, s2s))
@@ -163,7 +176,7 @@ class Server:
# Helper function to start a TCP stream server, running as a new task
# TODO could use an accept-callback on socket read activity instead of creating a task
async def start_server(cb, host, port, backlog=5):
async def start_server(cb, host, port, backlog=5, ssl=None):
import socket
# Create and bind server socket.
@@ -176,7 +189,7 @@ async def start_server(cb, host, port, backlog=5):
# Create and return server object and task.
srv = Server()
srv.task = core.create_task(srv._serve(s, cb))
srv.task = core.create_task(srv._serve(s, cb, ssl))
try:
# Ensure that the _serve task has been scheduled so that it gets to
# handle cancellation.
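Based on the new signatures shown in this hunk (``open_connection(host, port, ssl=None, server_hostname=None)`` and ``start_server(cb, host, port, backlog=5, ssl=None)``), a hedged sketch of TLS usage with these functions in scope; the certificate paths, host names and handler are placeholders::

    import ssl

    async def fetch_over_tls():
        # passing ssl=True builds a default client SSLContext internally;
        # an explicit SSLContext can be passed instead for custom settings
        reader, writer = await open_connection('example.com', 443, ssl=True)
        writer.write(b'GET / HTTP/1.0\r\nHost: example.com\r\n\r\n')
        await writer.drain()
        print(await reader.readline())
        writer.close()
        await writer.wait_closed()

    async def serve_over_tls(handler):
        # server side: the context must carry a certificate and private key
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
        ctx.load_cert_chain('cert.pem', 'key.pem')   # placeholder file names
        return await start_server(handler, '0.0.0.0', 8443, ssl=ctx)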

View File

@@ -1,79 +0,0 @@
from utime import *
from micropython import const
_TS_YEAR = const(0)
_TS_MON = const(1)
_TS_MDAY = const(2)
_TS_HOUR = const(3)
_TS_MIN = const(4)
_TS_SEC = const(5)
_TS_WDAY = const(6)
_TS_YDAY = const(7)
_TS_ISDST = const(8)
_WDAY = const(("Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday"))
_MDAY = const(
(
"January",
"February",
"March",
"April",
"May",
"June",
"July",
"August",
"September",
"October",
"November",
"December",
)
)
def strftime(datefmt, ts):
from io import StringIO
fmtsp = False
ftime = StringIO()
for k in datefmt:
if fmtsp:
if k == "a":
ftime.write(_WDAY[ts[_TS_WDAY]][0:3])
elif k == "A":
ftime.write(_WDAY[ts[_TS_WDAY]])
elif k == "b":
ftime.write(_MDAY[ts[_TS_MON] - 1][0:3])
elif k == "B":
ftime.write(_MDAY[ts[_TS_MON] - 1])
elif k == "d":
ftime.write("%02d" % ts[_TS_MDAY])
elif k == "H":
ftime.write("%02d" % ts[_TS_HOUR])
elif k == "I":
ftime.write("%02d" % (ts[_TS_HOUR] % 12))
elif k == "j":
ftime.write("%03d" % ts[_TS_YDAY])
elif k == "m":
ftime.write("%02d" % ts[_TS_MON])
elif k == "M":
ftime.write("%02d" % ts[_TS_MIN])
elif k == "P":
ftime.write("AM" if ts[_TS_HOUR] < 12 else "PM")
elif k == "S":
ftime.write("%02d" % ts[_TS_SEC])
elif k == "w":
ftime.write(str(ts[_TS_WDAY]))
elif k == "y":
ftime.write("%02d" % (ts[_TS_YEAR] % 100))
elif k == "Y":
ftime.write(str(ts[_TS_YEAR]))
else:
ftime.write(k)
fmtsp = False
elif k == "%":
fmtsp = True
else:
ftime.write(k)
val = ftime.getvalue()
ftime.close()
return val

View File

@@ -1,6 +1,6 @@
[project]
name = "microdot"
version = "2.0.1"
version = "2.3.3"
authors = [
{ name = "Miguel Grinberg", email = "miguel.grinberg@gmail.com" },
]
@@ -14,6 +14,8 @@ classifiers = [
"Operating System :: OS Independent",
]
requires-python = ">=3.8"
dependencies = [
]
[project.readme]
file = "README.md"
@@ -24,8 +26,12 @@ Homepage = "https://github.com/miguelgrinberg/microdot"
"Bug Tracker" = "https://github.com/miguelgrinberg/microdot/issues"
[project.optional-dependencies]
dev = [
"tox",
]
docs = [
"sphinx",
"pyjwt",
]
[tool.setuptools]

View File

@@ -2,7 +2,11 @@ import sys
sys.path.insert(0, 'src')
sys.path.insert(2, 'libs/common')
sys.path.insert(3, 'libs/micropython')
if sys.implementation.name == 'circuitpython':
sys.path.insert(3, 'libs/circuitpython')
sys.path.insert(4, 'libs/micropython')
else:
sys.path.insert(3, 'libs/micropython')
import unittest

View File

@@ -1,2 +1,2 @@
from microdot.microdot import Microdot, Request, Response, abort, redirect, \
send_file # noqa: F401
send_file, URLPattern, AsyncBytesIO, iscoroutine # noqa: F401

View File

@@ -45,6 +45,12 @@ class _BodyStream: # pragma: no cover
class Microdot(BaseMicrodot):
"""A subclass of the core :class:`Microdot <microdot.Microdot>` class that
implements the ASGI protocol.
This class must be used as the application instance when running under an
ASGI web server.
"""
def __init__(self):
super().__init__()
self.embedded_server = False

src/microdot/auth.py Normal file
View File

@@ -0,0 +1,162 @@
from microdot import abort
from microdot.microdot import invoke_handler
class BaseAuth:
def __init__(self):
self.auth_callback = None
self.error_callback = None
def __call__(self, f):
"""Decorator to protect a route with authentication.
An instance of this class must be used as a decorator on the routes
that need to be protected. Example::
auth = BasicAuth() # or TokenAuth()
@app.route('/protected')
@auth
def protected(request):
# ...
Routes that are decorated in this way will only be invoked if the
authentication callback returned a valid user object, otherwise the
error callback will be executed.
"""
async def wrapper(request, *args, **kwargs):
auth = self._get_auth(request)
if not auth:
return await invoke_handler(self.error_callback, request)
request.g.current_user = await invoke_handler(
self.auth_callback, request, *auth)
if not request.g.current_user:
return await invoke_handler(self.error_callback, request)
return await invoke_handler(f, request, *args, **kwargs)
return wrapper
def optional(self, f):
"""Decorator to protect a route with optional authentication.
This decorator makes authentication for the decorated route optional,
meaning that the route is allowed to run with or without
authentication given in the request.
"""
async def wrapper(request, *args, **kwargs):
auth = self._get_auth(request)
if not auth:
request.g.current_user = None
else:
request.g.current_user = await invoke_handler(
self.auth_callback, request, *auth)
return await invoke_handler(f, request, *args, **kwargs)
return wrapper
class BasicAuth(BaseAuth):
"""Basic Authentication.
:param realm: The realm that is displayed when the user is prompted to
authenticate in the browser.
:param charset: The charset that is used to encode the realm.
:param scheme: The authentication scheme. Defaults to 'Basic'.
:param error_status: The error status code to return when authentication
fails. Defaults to 401.
"""
def __init__(self, realm='Please login', charset='UTF-8', scheme='Basic',
error_status=401):
super().__init__()
self.realm = realm
self.charset = charset
self.scheme = scheme
self.error_status = error_status
self.error_callback = self.authentication_error
def _get_auth(self, request):
auth = request.headers.get('Authorization')
if auth and auth.startswith('Basic '):
import binascii
try:
username, password = binascii.a2b_base64(
auth[6:]).decode().split(':', 1)
except Exception: # pragma: no cover
return None
return username, password
async def authentication_error(self, request):
return '', self.error_status, {
'WWW-Authenticate': '{} realm="{}", charset="{}"'.format(
self.scheme, self.realm, self.charset)}
def authenticate(self, f):
"""Decorator to configure the authentication callback.
This decorator must be used with a function that accepts the request
object, a username and a password and returns a user object if the
credentials are valid, or ``None`` if they are not. Example::
@auth.authenticate
async def check_credentials(request, username, password):
user = get_user(username)
if user and user.check_password(password):
return get_user(username)
"""
self.auth_callback = f
class TokenAuth(BaseAuth):
"""Token based authentication.
:param header: The name of the header that will contain the token. Defaults
to 'Authorization'.
:param scheme: The authentication scheme. Defaults to 'Bearer'.
:param error_status: The error status code to return when authentication
fails. Defaults to 401.
"""
def __init__(self, header='Authorization', scheme='Bearer',
error_status=401):
super().__init__()
self.header = header
self.scheme = scheme.lower()
self.error_status = error_status
self.error_callback = self.authentication_error
def _get_auth(self, request):
auth = request.headers.get(self.header)
if auth:
if self.header == 'Authorization':
try:
scheme, token = auth.split(' ', 1)
except Exception:
return None
if scheme.lower() == self.scheme:
return (token.strip(),)
else:
return (auth,)
def authenticate(self, f):
"""Decorator to configure the authentication callback.
This decorator must be used with a function that accepts the request
object and a token, and returns a user object if the token is valid,
or ``None`` if it is not. Example::
@auth.authenticate
async def check_credentials(request, token):
return get_user(token)
"""
self.auth_callback = f
def errorhandler(self, f):
"""Decorator to configure the error callback.
Microdot calls the error callback to allow the application to generate
a custom error response. The default error response is to call
``abort(401)``.
"""
self.error_callback = f
async def authentication_error(self, request):
abort(self.error_status)
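Putting the new auth module together, a hedged usage sketch for ``TokenAuth``; the token table, route and user objects are illustrative only::

    from microdot import Microdot
    from microdot.auth import TokenAuth

    app = Microdot()
    auth = TokenAuth()

    # illustrative token store; a real application would look tokens up elsewhere
    TOKENS = {'secret-token': {'username': 'alice'}}

    @auth.authenticate
    async def verify_token(request, token):
        return TOKENS.get(token)

    @app.route('/private')
    @auth
    async def private(request):
        # the object returned by verify_token is exposed as request.g.current_user
        return {'user': request.g.current_user['username']}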

src/microdot/helpers.py Normal file
View File

@@ -0,0 +1,8 @@
try:
from functools import wraps
except ImportError: # pragma: no cover
# MicroPython does not currently implement functools.wraps
def wraps(wrapped):
def _(wrapper):
return wrapper
return _

View File

@@ -1,19 +1,27 @@
from jinja2 import Environment, FileSystemLoader, select_autoescape
_jinja_env = None
class Template:
"""A template object.
:param template: The filename of the template to render, relative to the
configured template directory.
:param kwargs: any additional options to be passed to the Jinja
environment's ``get_template()`` method.
"""
#: The Jinja environment. The ``initialize()`` method must be called before
#: this attribute is accessed.
jinja_env = None
@classmethod
def initialize(cls, template_dir='templates', enable_async=False,
**kwargs):
"""Initialize the templating subsystem.
This method is automatically invoked when the first template is
created. The application can call it explicitly if custom options need
to be provided.
:param template_dir: the directory where templates are stored. This
argument is optional. The default is to load
templates from a *templates* subdirectory.
@@ -23,20 +31,19 @@ class Template:
:param kwargs: any additional options to be passed to Jinja's
``Environment`` class.
"""
global _jinja_env
_jinja_env = Environment(
cls.jinja_env = Environment(
loader=FileSystemLoader(template_dir),
autoescape=select_autoescape(),
enable_async=enable_async,
**kwargs
)
def __init__(self, template):
if _jinja_env is None: # pragma: no cover
def __init__(self, template, **kwargs):
if self.jinja_env is None: # pragma: no cover
self.initialize()
#: The name of the template
#: The name of the template.
self.name = template
self.template = _jinja_env.get_template(template)
self.template = self.jinja_env.get_template(template, **kwargs)
def generate(self, *args, **kwargs):
"""Return a generator that renders the template in chunks, with the

src/microdot/login.py Normal file
View File

@@ -0,0 +1,163 @@
from time import time
from microdot import redirect
from microdot.microdot import urlencode, invoke_handler
class Login:
"""User login support for Microdot.
:param login_url: the URL to redirect to when a login is required. The
default is '/login'.
"""
def __init__(self, login_url='/login'):
self.login_url = login_url
self.user_loader_callback = None
def user_loader(self, f):
"""Decorator to configure the user callback.
The decorated function receives the user ID as an argument and must
return the corresponding user object, or ``None`` if the user ID is
invalid.
"""
self.user_loader_callback = f
def _get_session(self, request):
return request.app._session.get(request)
def _update_remember_cookie(self, request, days, user_id=None):
remember_payload = request.app._session.encode({
'user_id': user_id,
'days': days,
'exp': time() + days * 24 * 60 * 60
})
@request.after_request
async def _set_remember_cookie(request, response):
response.set_cookie('_remember', remember_payload,
max_age=days * 24 * 60 * 60)
return response
def _get_user_id_from_session(self, request):
session = self._get_session(request)
if session and '_user_id' in session:
return session['_user_id']
if '_remember' in request.cookies:
remember_payload = request.app._session.decode(
request.cookies['_remember'])
user_id = remember_payload.get('user_id')
if user_id: # pragma: no branch
self._update_remember_cookie(
request, remember_payload.get('days', 30), user_id)
session['_user_id'] = user_id
session['_fresh'] = False
session.save()
return user_id
async def _redirect_to_login(self, request):
return '', 302, {'Location': self.login_url + '?next=' + urlencode(
request.url)}
async def login_user(self, request, user, remember=False,
redirect_url='/'):
"""Log a user in.
:param request: the request object
:param user: the user object
:param remember: if the user's logged in state should be remembered
with a cookie after the session ends. Set to the
number of days the remember cookie should last, or to
``True`` to use a default duration of 30 days.
:param redirect_url: the URL to redirect to after login
This call marks the user as logged in by storing their user ID in the
user session. The application must call this method to log a user in
after their credentials have been validated.
The method returns a redirect response, either to the URL the user
originally intended to visit or, if there is no original URL, to the URL
specified by `redirect_url`.
"""
session = self._get_session(request)
session['_user_id'] = user.id
session['_fresh'] = True
session.save()
if remember:
days = 30 if remember is True else int(remember)
self._update_remember_cookie(request, days, session['_user_id'])
next_url = request.args.get('next', redirect_url)
if not next_url.startswith('/'):
next_url = redirect_url
return redirect(next_url)
async def logout_user(self, request):
"""Log a user out.
:param request: the request object
This call removes information about the user's log in from the user
session. If a remember cookie exists, it is removed as well.
"""
session = self._get_session(request)
session.pop('_user_id', None)
session.pop('_fresh', None)
session.save()
if '_remember' in request.cookies:
self._update_remember_cookie(request, 0)
def __call__(self, f):
"""Decorator to protect a route with authentication.
If the user is not logged in, Microdot will redirect to the login page
first. The decorated route will only run after successful login by the
user. If the user is already logged in, the route will run immediately.
Example::
login = Login()
@app.route('/secret')
@login
async def secret(request):
# only accessible to authenticated users
"""
async def wrapper(request, *args, **kwargs):
user_id = self._get_user_id_from_session(request)
if not user_id:
return await self._redirect_to_login(request)
request.g.current_user = await invoke_handler(
self.user_loader_callback, user_id)
if not request.g.current_user:
return await self._redirect_to_login(request)
return await invoke_handler(f, request, *args, **kwargs)
return wrapper
def fresh(self, f):
"""Decorator to protect a route with "fresh" authentication.
This decorator prevents the route from running when the login session
is not fresh. A fresh session is a session that has been created from
direct user interaction with the login page, while a non-fresh session
occurs when a login is restored from a "remember me" cookie. Example::
login = Login()
@app.route('/secret')
@login.fresh
async def secret(request):
# only accessible to authenticated users
# users logged in via remember me cookie will need to
# re-authenticate
"""
base_wrapper = self.__call__(f)
async def wrapper(request, *args, **kwargs):
session = self._get_session(request)
if session.get('_fresh'):
return await base_wrapper(request, *args, **kwargs)
return await self._redirect_to_login(request)
return wrapper
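An end-to-end hedged sketch of the ``Login`` extension defined above; ``USERS`` and ``check_credentials`` are placeholders, and the session extension is configured as elsewhere in this changeset::

    from microdot import Microdot
    from microdot.login import Login
    from microdot.session import Session

    app = Microdot()
    Session(app, secret_key='top-secret!')  # Login stores the user id in the session
    login = Login()                         # redirects to '/login' by default

    @login.user_loader
    async def load_user(user_id):
        return USERS.get(user_id)           # placeholder user lookup

    @app.post('/login')
    async def login_page(request):
        user = check_credentials(request.form)   # placeholder credential check
        if user is None:
            return 'Invalid credentials', 401
        return await login.login_user(request, user, remember=True)

    @app.route('/secret')
    @login
    async def secret(request):
        return 'Hello, {}!'.format(request.g.current_user.id)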

View File

@@ -7,12 +7,17 @@ servers for MicroPython and standard Python.
"""
import asyncio
import io
import json
import re
import time
try:
import orjson as json
except ImportError:
import json
try:
from inspect import iscoroutinefunction, iscoroutine
from functools import partial
async def invoke_handler(handler, *args, **kwargs):
"""Invoke a handler and return the result.
@@ -23,7 +28,7 @@ try:
ret = await handler(*args, **kwargs)
else:
ret = await asyncio.get_running_loop().run_in_executor(
None, handler, *args, **kwargs)
None, partial(handler, *args, **kwargs))
return ret
except ImportError: # pragma: no cover
def iscoroutine(coro):
@@ -56,23 +61,9 @@ MUTED_SOCKET_ERRORS = [
]
def urldecode_str(s):
s = s.replace('+', ' ')
parts = s.split('%')
if len(parts) == 1:
return s
result = [parts[0]]
for item in parts[1:]:
if item == '':
result.append('%')
else:
code = item[:2]
result.append(chr(int(code, 16)))
result.append(item[2:])
return ''.join(result)
def urldecode_bytes(s):
def urldecode(s):
if isinstance(s, str):
s = s.encode()
s = s.replace(b'+', b' ')
parts = s.split(b'%')
if len(parts) == 1:
@@ -329,7 +320,8 @@ class Request:
pass
def __init__(self, app, client_addr, method, url, http_version, headers,
body=None, stream=None, sock=None):
body=None, stream=None, sock=None, url_prefix='',
subapp=None):
#: The application instance to which this request belongs.
self.app = app
#: The address of the client, as a tuple (host, port).
@@ -338,6 +330,12 @@ class Request:
self.method = method
#: The request URL, including the path and query string.
self.url = url
#: The URL prefix, if the endpoint comes from a mounted
#: sub-application, or else ''.
self.url_prefix = url_prefix
#: The sub-application instance, or `None` if this isn't a mounted
#: endpoint.
self.subapp = subapp
#: The path portion of the URL.
self.path = url
#: The query string portion of the URL.
@@ -377,6 +375,7 @@ class Request:
self.sock = sock
self._json = None
self._form = None
self._files = None
self.after_request_handlers = []
@staticmethod
@@ -433,12 +432,12 @@ class Request:
if isinstance(urlencoded, str):
for kv in [pair.split('=', 1)
for pair in urlencoded.split('&') if pair]:
data[urldecode_str(kv[0])] = urldecode_str(kv[1]) \
data[urldecode(kv[0])] = urldecode(kv[1]) \
if len(kv) > 1 else ''
elif isinstance(urlencoded, bytes): # pragma: no branch
for kv in [pair.split(b'=', 1)
for pair in urlencoded.split(b'&') if pair]:
data[urldecode_bytes(kv[0])] = urldecode_bytes(kv[1]) \
data[urldecode(kv[0])] = urldecode(kv[1]) \
if len(kv) > 1 else b''
return data
@@ -471,7 +470,13 @@ class Request:
def form(self):
"""The parsed form submission body, as a
:class:`MultiDict <microdot.MultiDict>` object, or ``None`` if the
request does not have a form submission."""
request does not have a form submission.
Forms that are URL encoded are processed by default. For multipart
forms to be processed, the
:func:`with_form_data <microdot.multipart.with_form_data>`
decorator must be added to the route.
"""
if self._form is None:
if self.content_type is None:
return None
@@ -481,6 +486,17 @@ class Request:
self._form = self._parse_urlencoded(self.body)
return self._form
@property
def files(self):
"""The files uploaded in the request as a dictionary, or ``None`` if
the request does not have any files.
The :func:`with_form_data <microdot.multipart.with_form_data>`
decorator must be added to the route that receives file uploads for
this property to be set.
"""
return self._files
def after_request(self, f):
"""Register a request-specific function to run after the request is
handled. Request-specific after request handlers run at the very end,
@@ -538,6 +554,7 @@ class Response:
'json': 'application/json',
'png': 'image/png',
'txt': 'text/plain',
'svg': 'image/svg+xml',
}
send_file_buffer_size = 1024
@@ -562,9 +579,9 @@ class Response:
self.headers = NoCaseDict(headers or {})
self.reason = reason
if isinstance(body, (dict, list)):
self.body = json.dumps(body).encode()
body = json.dumps(body)
self.headers['Content-Type'] = 'application/json; charset=UTF-8'
elif isinstance(body, str):
if isinstance(body, str):
self.body = body.encode()
else:
# this applies to bytes, file-like objects or generators
@@ -595,10 +612,10 @@ class Response:
if expires:
if isinstance(expires, str):
http_cookie += '; Expires=' + expires
else:
else: # pragma: no cover
http_cookie += '; Expires=' + time.strftime(
'%a, %d %b %Y %H:%M:%S GMT', expires.timetuple())
if max_age:
if max_age is not None:
http_cookie += '; Max-Age=' + str(max_age)
if secure:
http_cookie += '; Secure'
@@ -616,10 +633,10 @@ class Response:
:param cookie: The cookie's name.
:param kwargs: Any cookie options and flags supported by
``set_cookie()`` except ``expires``.
``set_cookie()`` except ``expires`` and ``max_age``.
"""
self.set_cookie(cookie, '', expires='Thu, 01 Jan 1970 00:00:01 GMT',
**kwargs)
max_age=0, **kwargs)
def complete(self):
if isinstance(self.body, bytes) and \
@@ -774,7 +791,10 @@ class Response:
first.
"""
if content_type is None:
ext = filename.split('.')[-1]
if compressed and filename.endswith('.gz'):
ext = filename[:-3].split('.')[-1]
else:
ext = filename.split('.')[-1]
if ext in Response.types_map:
content_type = Response.types_map[ext]
else:
@@ -795,12 +815,54 @@ class Response:
class URLPattern():
"""A class that represents the URL pattern for a route.
:param url_pattern: The route URL pattern, which can include static and
dynamic path segments. Dynamic segments are enclosed in
``<`` and ``>``. The type of the segment can be given
as a prefix, separated from the name with a colon.
Supported types are ``string`` (the default),
``int`` and ``path``. Custom types can be registered
using the :meth:`URLPattern.register_type` method.
"""
segment_patterns = {
'string': '/([^/]+)',
'int': '/(-?\\d+)',
'path': '/(.+)',
}
segment_parsers = {
'int': lambda value: int(value),
}
@classmethod
def register_type(cls, type_name, pattern='[^/]+', parser=None):
"""Register a new URL segment type.
:param type_name: The name of the segment type to register.
:param pattern: The regular expression pattern to use when matching
this segment type. If not given, a default matcher for
a single path segment is used.
:param parser: A callable that will be used to parse and transform the
value of the segment. If omitted, the value is returned
as a string.
"""
cls.segment_patterns[type_name] = '/({})'.format(pattern)
cls.segment_parsers[type_name] = parser
def __init__(self, url_pattern):
self.url_pattern = url_pattern
self.pattern = ''
self.args = []
use_regex = False
for segment in url_pattern.lstrip('/').split('/'):
self.segments = []
self.regex = None
def compile(self):
"""Generate a regular expression for the URL pattern.
This method is automatically invoked the first time the URL pattern is
matched against a path.
"""
pattern = ''
for segment in self.url_pattern.lstrip('/').split('/'):
if segment and segment[0] == '<':
if segment[-1] != '>':
raise ValueError('invalid URL pattern')
@@ -810,42 +872,48 @@ class URLPattern():
else:
type_ = 'string'
name = segment
if type_ == 'string':
pattern = '[^/]+'
elif type_ == 'int':
pattern = '-?\\d+'
elif type_ == 'path':
pattern = '.+'
elif type_.startswith('re:'):
pattern = type_[3:]
parser = None
if type_.startswith('re:'):
pattern += '/({pattern})'.format(pattern=type_[3:])
else:
raise ValueError('invalid URL segment type')
use_regex = True
self.pattern += '/({pattern})'.format(pattern=pattern)
self.args.append({'type': type_, 'name': name})
if type_ not in self.segment_patterns:
raise ValueError('invalid URL segment type')
pattern += self.segment_patterns[type_]
parser = self.segment_parsers.get(type_)
self.segments.append({'parser': parser, 'name': name,
'type': type_})
else:
self.pattern += '/{segment}'.format(segment=segment)
if use_regex:
self.pattern = re.compile('^' + self.pattern + '$')
pattern += '/' + segment
self.segments.append({'parser': None})
self.regex = re.compile('^' + pattern + '$')
return self.regex
def match(self, path):
if isinstance(self.pattern, str):
if path != self.pattern:
return
return {}
g = self.pattern.match(path)
"""Match a path against the URL pattern.
Returns a dictionary with the values of all dynamic path segments if a
match is found, or ``None`` if the path does not match this pattern.
"""
args = {}
g = (self.regex or self.compile()).match(path)
if not g:
return
args = {}
i = 1
for arg in self.args:
value = g.group(i)
if arg['type'] == 'int':
value = int(value)
args[arg['name']] = value
for segment in self.segments:
if 'name' not in segment:
continue
arg = g.group(i)
if segment['parser']:
arg = self.segment_parsers[segment['type']](arg)
if arg is None:
return
args[segment['name']] = arg
i += 1
return args
def __repr__(self): # pragma: no cover
return 'URLPattern: {}'.format(self.url_pattern)
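A hedged sketch of the new ``URLPattern.register_type()`` hook; the ``hex`` segment type and the route are made up for illustration::

    from microdot import Microdot, URLPattern

    app = Microdot()

    # match one path segment of hex digits and convert it to an integer
    URLPattern.register_type('hex', pattern='[0-9a-fA-F]+',
                             parser=lambda value: int(value, 16))

    @app.route('/blocks/<hex:block_id>')
    async def get_block(request, block_id):
        # block_id arrives already parsed as an int
        return {'block_id': block_id}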
class HTTPException(Exception):
def __init__(self, status_code, reason=None):
@@ -914,7 +982,7 @@ class Microdot:
def decorated(f):
self.url_map.append(
([m.upper() for m in (methods or ['GET'])],
URLPattern(url_pattern), f))
URLPattern(url_pattern), f, '', None))
return f
return decorated
@@ -1082,24 +1150,33 @@ class Microdot:
return f
return decorated
def mount(self, subapp, url_prefix=''):
def mount(self, subapp, url_prefix='', local=False):
"""Mount a sub-application, optionally under the given URL prefix.
:param subapp: The sub-application to mount.
:param url_prefix: The URL prefix to mount the application under.
:param local: When set to ``True``, the before, after and error request
handlers only apply to endpoints defined in the
sub-application. When ``False``, they apply to the entire
application. The default is ``False``.
"""
for methods, pattern, handler in subapp.url_map:
for methods, pattern, handler, _prefix, _subapp in subapp.url_map:
self.url_map.append(
(methods, URLPattern(url_prefix + pattern.url_pattern),
handler))
for handler in subapp.before_request_handlers:
self.before_request_handlers.append(handler)
for handler in subapp.after_request_handlers:
self.after_request_handlers.append(handler)
for handler in subapp.after_error_request_handlers:
self.after_error_request_handlers.append(handler)
for status_code, handler in subapp.error_handlers.items():
self.error_handlers[status_code] = handler
handler, url_prefix + _prefix, _subapp or subapp))
if not local:
for handler in subapp.before_request_handlers:
self.before_request_handlers.append(handler)
subapp.before_request_handlers = []
for handler in subapp.after_request_handlers:
self.after_request_handlers.append(handler)
subapp.after_request_handlers = []
for handler in subapp.after_error_request_handlers:
self.after_error_request_handlers.append(handler)
subapp.after_error_request_handlers = []
for status_code, handler in subapp.error_handlers.items():
self.error_handlers[status_code] = handler
subapp.error_handlers = {}
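A hedged sketch of mounting a sub-application with the new ``local`` option; the endpoints and the API key check are illustrative::

    from microdot import Microdot

    api = Microdot()

    @api.before_request
    async def check_api_key(request):
        # with local=True below this hook only runs for endpoints defined on `api`
        if request.headers.get('X-API-Key') != 'expected-key':
            return 'Unauthorized', 401

    @api.get('/items')
    async def items(request):
        return ['a', 'b', 'c']

    app = Microdot()

    @app.get('/')
    async def index(request):
        return 'public page'

    app.mount(api, url_prefix='/api', local=True)
    # GET /api/items now requires the API key header; GET / does not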
@staticmethod
def abort(status_code, reason=None):
@@ -1149,7 +1226,7 @@ class Microdot:
Example::
import asyncio
from microdot_asyncio import Microdot
from microdot import Microdot
app = Microdot()
@@ -1226,7 +1303,7 @@ class Microdot:
Example::
from microdot_asyncio import Microdot
from microdot import Microdot
app = Microdot()
@@ -1257,23 +1334,28 @@ class Microdot:
def find_route(self, req):
method = req.method.upper()
if method == 'OPTIONS' and self.options_handler:
return self.options_handler(req)
return self.options_handler(req), '', None
if method == 'HEAD':
method = 'GET'
f = 404
for route_methods, route_pattern, route_handler in self.url_map:
p = ''
s = None
for route_methods, route_pattern, route_handler, url_prefix, subapp \
in self.url_map:
req.url_args = route_pattern.match(req.path)
if req.url_args is not None:
p = url_prefix
s = subapp
if method in route_methods:
f = route_handler
break
else:
f = 405
return f
return f, p, s
def default_options_handler(self, req):
allow = []
for route_methods, route_pattern, route_handler in self.url_map:
for route_methods, route_pattern, _, _, _ in self.url_map:
if route_pattern.match(req.path) is not None:
allow.extend(route_methods)
if 'GET' in allow:
@@ -1290,9 +1372,9 @@ class Microdot:
print_exception(exc)
res = await self.dispatch_request(req)
if res != Response.already_handled: # pragma: no branch
await res.write(writer)
try:
if res != Response.already_handled: # pragma: no branch
await res.write(writer)
await writer.aclose()
except OSError as exc: # pragma: no cover
if exc.errno in MUTED_SOCKET_ERRORS:
@@ -1304,38 +1386,76 @@ class Microdot:
method=req.method, path=req.path,
status_code=res.status_code))
def get_request_handlers(self, req, attr, local_first=True):
handlers = getattr(self, attr + '_handlers')
local_handlers = getattr(req.subapp, attr + '_handlers') \
if req and req.subapp else []
return local_handlers + handlers if local_first \
else handlers + local_handlers
async def error_response(self, req, status_code, reason=None):
if req and req.subapp and status_code in req.subapp.error_handlers:
return await invoke_handler(
req.subapp.error_handlers[status_code], req)
elif status_code in self.error_handlers:
return await invoke_handler(self.error_handlers[status_code], req)
return reason or 'N/A', status_code
async def dispatch_request(self, req):
after_request_handled = False
if req:
if req.content_length > req.max_content_length:
if 413 in self.error_handlers:
res = await invoke_handler(self.error_handlers[413], req)
else:
res = 'Payload too large', 413
# the request body is larger than allowed
res = await self.error_response(req, 413, 'Payload too large')
else:
f = self.find_route(req)
# find the route in the app's URL map
f, req.url_prefix, req.subapp = self.find_route(req)
try:
res = None
if callable(f):
for handler in self.before_request_handlers:
# invoke the before request handlers
for handler in self.get_request_handlers(
req, 'before_request', False):
res = await invoke_handler(handler, req)
if res:
break
# invoke the endpoint handler
if res is None:
res = await invoke_handler(
f, req, **req.url_args)
res = await invoke_handler(f, req, **req.url_args)
# process the response
if isinstance(res, int):
# an integer response is taken as a status code
# with an empty body
res = '', res
if isinstance(res, tuple):
# handle a tuple response
if isinstance(res[0], int):
# a tuple that starts with an int has an empty
# body
res = ('', res[0],
res[1] if len(res) > 1 else {})
body = res[0]
if isinstance(res[1], int):
# extract the status code and headers (if
# available)
status_code = res[1]
headers = res[2] if len(res) > 2 else {}
else:
# if the status code is missing, assume 200
status_code = 200
headers = res[1]
res = Response(body, status_code, headers)
elif not isinstance(res, Response):
# any other response types are wrapped in a
# Response object
res = Response(res)
for handler in self.after_request_handlers:
# invoke the after request handlers
for handler in self.get_request_handlers(
req, 'after_request', True):
res = await invoke_handler(
handler, req, res) or res
for handler in req.after_request_handlers:
@@ -1343,50 +1463,62 @@ class Microdot:
handler, req, res) or res
after_request_handled = True
elif isinstance(f, dict):
# the response from an OPTIONS request is a dict with
# headers
res = Response(headers=f)
elif f in self.error_handlers:
res = await invoke_handler(self.error_handlers[f], req)
else:
res = 'Not found', f
# if the route is not found, return a 404 or 405
# response as appropriate
res = await self.error_response(req, f, 'Not found')
except HTTPException as exc:
if exc.status_code in self.error_handlers:
res = self.error_handlers[exc.status_code](req)
else:
res = exc.reason, exc.status_code
# an HTTP exception was raised while handling this request
res = await self.error_response(req, exc.status_code,
exc.reason)
except Exception as exc:
# an unexpected exception was raised while handling this
# request
print_exception(exc)
exc_class = None
# invoke the error handler for the exception class if one
# exists
handler = None
res = None
if exc.__class__ in self.error_handlers:
exc_class = exc.__class__
if req.subapp and exc.__class__ in \
req.subapp.error_handlers:
handler = req.subapp.error_handlers[exc.__class__]
elif exc.__class__ in self.error_handlers:
handler = self.error_handlers[exc.__class__]
else:
# walk up the exception class hierarchy to try to find
# a handler
for c in mro(exc.__class__)[1:]:
if c in self.error_handlers:
exc_class = c
if req.subapp and c in req.subapp.error_handlers:
handler = req.subapp.error_handlers[c]
break
if exc_class:
elif c in self.error_handlers:
handler = self.error_handlers[c]
break
if handler:
try:
res = await invoke_handler(
self.error_handlers[exc_class], req, exc)
res = await invoke_handler(handler, req, exc)
except Exception as exc2: # pragma: no cover
print_exception(exc2)
if res is None:
if 500 in self.error_handlers:
res = await invoke_handler(
self.error_handlers[500], req)
else:
res = 'Internal server error', 500
# if there is still no response, issue a 500 error
res = await self.error_response(
req, 500, 'Internal server error')
else:
if 400 in self.error_handlers:
res = await invoke_handler(self.error_handlers[400], req)
else:
res = 'Bad request', 400
# if the request could not be parsed, issue a 400 error
res = await self.error_response(req, 400, 'Bad request')
if isinstance(res, tuple):
res = Response(*res)
elif not isinstance(res, Response):
res = Response(res)
if not after_request_handled:
for handler in self.after_error_request_handlers:
# if the request did not finish due to an error, invoke the after
# error request handler
for handler in self.get_request_handlers(
req, 'after_error_request', True):
res = await invoke_handler(
handler, req, res) or res
res.is_head = (req and req.method == 'HEAD')
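The response normalization above means a route handler can return several shapes; a hedged summary sketch (the routes themselves are illustrative)::

    from microdot import Microdot

    app = Microdot()

    @app.route('/text')
    async def text(request):
        return 'plain text'                         # body only, status 200

    @app.route('/created')
    async def created(request):
        return 'saved', 201                         # body and status code

    @app.route('/teapot')
    async def teapot(request):
        return 418                                  # bare int: empty body, status 418

    @app.route('/json')
    async def as_json(request):
        return {'ok': True}, 200, {'X-Extra': '1'}  # body, status and headers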

src/microdot/multipart.py Normal file
View File

@@ -0,0 +1,291 @@
import os
from random import choice
from microdot import abort, iscoroutine, AsyncBytesIO
from microdot.helpers import wraps
class FormDataIter:
"""Asynchronous iterator that parses a ``multipart/form-data`` body and
returns form fields and files as they are parsed.
:param request: the request object to parse.
Example usage::
from microdot.multipart import FormDataIter
@app.post('/upload')
async def upload(request):
async for name, value in FormDataIter(request):
print(name, value)
The iterator returns no values when the request has a content type other
than ``multipart/form-data``. For a file field, the returned value is of
type :class:`FileUpload`, which supports the
:meth:`read() <FileUpload.read>` and :meth:`save() <FileUpload.save>`
methods. Values for regular fields are provided as strings.
The request body is read efficiently in chunks of size
:attr:`buffer_size <FormDataIter.buffer_size>`. On iterations in which a
file field is encountered, the file must be consumed before moving on to
the next iteration, as the internal stream stored in ``FileUpload``
instances is invalidated at the end of the iteration.
"""
#: The size of the buffer used to read chunks of the request body.
buffer_size = 256
def __init__(self, request):
self.request = request
self.buffer = None
try:
mimetype, boundary = request.content_type.rsplit('; boundary=', 1)
except ValueError:
return # not a multipart request
if mimetype.split(';', 1)[0] == \
'multipart/form-data': # pragma: no branch
self.boundary = b'--' + boundary.encode()
self.extra_size = len(boundary) + 4
self.buffer = b''
def __aiter__(self):
return self
async def __anext__(self):
if self.buffer is None:
raise StopAsyncIteration
# make sure we have consumed the previous entry
while await self._read_buffer(self.buffer_size) != b'':
pass
# make sure we are at a boundary
s = self.buffer.split(self.boundary, 1)
if len(s) != 2 or s[0] != b'':
abort(400) # pragma: no cover
self.buffer = s[1]
if self.buffer[:2] == b'--':
# we have reached the end
raise StopAsyncIteration
elif self.buffer[:2] != b'\r\n':
abort(400) # pragma: no cover
self.buffer = self.buffer[2:]
# parse the headers of this part
name = ''
filename = None
content_type = None
while True:
await self._fill_buffer()
lines = self.buffer.split(b'\r\n', 1)
if len(lines) != 2:
abort(400) # pragma: no cover
line, self.buffer = lines
if line == b'':
# we reached the end of the headers
break
header, value = line.decode().split(':', 1)
header = header.lower()
value = value.strip()
if header == 'content-disposition':
parts = value.split(';')
if len(parts) < 2 or parts[0] != 'form-data':
abort(400) # pragma: no cover
for part in parts[1:]:
part = part.strip()
if part.startswith('name="'):
name = part[6:-1]
elif part.startswith('filename="'): # pragma: no branch
filename = part[10:-1]
elif header == 'content-type': # pragma: no branch
content_type = value
if filename is None:
# this is a regular form field, so we read the value
value = b''
while True:
v = await self._read_buffer(self.buffer_size)
value += v
if len(v) < self.buffer_size: # pragma: no branch
break
return name, value.decode()
return name, FileUpload(filename, content_type, self._read_buffer)
async def _fill_buffer(self):
self.buffer += await self.request.stream.read(
self.buffer_size + self.extra_size - len(self.buffer))
async def _read_buffer(self, n=-1):
data = b''
while n == -1 or len(data) < n:
await self._fill_buffer()
s = self.buffer.split(self.boundary, 1)
data += s[0][:n] if n != -1 else s[0]
self.buffer = s[0][n:] if n != -1 else b''
if len(s) == 2: # pragma: no branch
# the end of this part is in the buffer
if len(self.buffer) < 2:
# we have read all the way to the end of this part
data = data[:-(2 - len(self.buffer))] # remove last "\r\n"
self.buffer += self.boundary + s[1]
return data
return data
class FileUpload:
"""Class that represents an uploaded file.
:param filename: the name of the uploaded file.
:param content_type: the content type of the uploaded file.
:param read: a coroutine that reads from the uploaded file's stream.
An uploaded file can be read from the stream using the :meth:`read()`
method or saved to a file using the :meth:`save()` method.
Instances of this class do not normally need to be created directly.
"""
#: The size at which the file is copied to a temporary file.
max_memory_size = 1024
def __init__(self, filename, content_type, read):
self.filename = filename
self.content_type = content_type
self._read = read
self._close = None
async def read(self, n=-1):
"""Read up to ``n`` bytes from the uploaded file's stream.
:param n: the maximum number of bytes to read. If ``n`` is -1 or not
given, the entire file is read.
"""
return await self._read(n)
async def save(self, path_or_file):
"""Save the uploaded file to the given path or file object.
:param path_or_file: the path to save the file to, or a file object
to which the file is to be written.
The file is read and written in chunks of size
:attr:`FormDataIter.buffer_size`.
"""
if isinstance(path_or_file, str):
f = open(path_or_file, 'wb')
else:
f = path_or_file
while True:
data = await self.read(FormDataIter.buffer_size)
if not data:
break
f.write(data)
if f != path_or_file:
f.close()
async def copy(self, max_memory_size=None):
"""Copy the uploaded file to a temporary file, to allow the parsing of
the multipart form to continue.
:param max_memory_size: the maximum size of the file to keep in memory.
If not given, then the class attribute of the
same name is used.
"""
max_memory_size = max_memory_size or FileUpload.max_memory_size
buffer = await self.read(max_memory_size)
if len(buffer) < max_memory_size:
f = AsyncBytesIO(buffer)
self._read = f.read
return self
# create a temporary file
while True:
tmpname = "".join([
choice('abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ')
for _ in range(12)
])
try:
f = open(tmpname, 'x+b')
except OSError as e: # pragma: no cover
if e.errno == 17:
# EEXIST
continue
elif e.errno == 2:
# ENOENT
# some MicroPython platforms do not support mode "x"
f = open(tmpname, 'w+b')
if f.read(1) != b'':
f.close()
continue
else:
raise
break
f.write(buffer)
await self.save(f)
f.seek(0)
async def read(n=-1):
return f.read(n)
async def close():
f.close()
os.remove(tmpname)
self._read = read
self._close = close
return self
async def close(self):
"""Close an open file.
This method must be called to free memory or temporary files created by
the ``copy()`` method.
Note that when using the ``@with_form_data`` decorator this method is
called automatically when the request ends.
"""
if self._close:
await self._close()
self._close = None
def with_form_data(f):
"""Decorator that parses a ``multipart/form-data`` body and updates the
request object with the parsed form fields and files.
Example usage::
from microdot.multipart import with_form_data
@app.post('/upload')
@with_form_data
async def upload(request):
print('form fields:', request.form)
print('files:', request.files)
Note: this decorator calls the :meth:`FileUpload.copy()
<microdot.multipart.FileUpload.copy>` method on all uploaded files, so that
the request can be parsed in its entirety. The files are either copied to
memory or a temporary file, depending on their size. The temporary files
are automatically deleted when the request ends.
"""
@wraps(f)
async def wrapper(request, *args, **kwargs):
form = {}
files = {}
async for name, value in FormDataIter(request):
if isinstance(value, FileUpload):
files[name] = await value.copy()
else:
form[name] = value
if form or files:
request._form = form
request._files = files
try:
ret = f(request, *args, **kwargs)
if iscoroutine(ret):
ret = await ret
finally:
if request.files:
for file in request.files.values():
await file.close()
return ret
return wrapper
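To complement the ``with_form_data`` decorator, a hedged sketch of streaming a large upload with ``FormDataIter`` and ``FileUpload.save()``; the upload directory is illustrative::

    from microdot import Microdot
    from microdot.multipart import FormDataIter, FileUpload

    app = Microdot()

    @app.post('/upload')
    async def upload(request):
        async for name, value in FormDataIter(request):
            if isinstance(value, FileUpload):
                # stream the file to storage in buffer_size chunks, without
                # loading the whole request body into memory
                await value.save('/uploads/' + value.filename)
            else:
                print('field', name, '=', value)
        return {'status': 'ok'}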

View File

@@ -1,7 +1,6 @@
import jwt
from microdot.microdot import invoke_handler
secret_key = None
from microdot.helpers import wraps
class SessionDict(dict):
@@ -30,14 +29,21 @@ class Session:
"""
secret_key = None
def __init__(self, app=None, secret_key=None):
def __init__(self, app=None, secret_key=None, cookie_options=None):
self.secret_key = secret_key
self.cookie_options = cookie_options or {}
if app is not None:
self.initialize(app)
def initialize(self, app, secret_key=None):
def initialize(self, app, secret_key=None, cookie_options=None):
if secret_key is not None:
self.secret_key = secret_key
if cookie_options is not None:
self.cookie_options = cookie_options
if 'path' not in self.cookie_options:
self.cookie_options['path'] = '/'
if 'http_only' not in self.cookie_options:
self.cookie_options['http_only'] = True
app._session = self
def get(self, request):
@@ -57,13 +63,7 @@ class Session:
if session is None:
request.g._session = SessionDict(request, {})
return request.g._session
try:
session = jwt.decode(session, self.secret_key,
algorithms=['HS256'])
except jwt.exceptions.PyJWTError: # pragma: no cover
request.g._session = SessionDict(request, {})
else:
request.g._session = SessionDict(request, session)
request.g._session = SessionDict(request, self.decode(session))
return request.g._session
def update(self, request, session):
@@ -89,12 +89,12 @@ class Session:
if not self.secret_key:
raise ValueError('The session secret key is not configured')
encoded_session = jwt.encode(session, self.secret_key,
algorithm='HS256')
encoded_session = self.encode(session)
@request.after_request
def _update_session(request, response):
response.set_cookie('session', encoded_session, http_only=True)
response.set_cookie('session', encoded_session,
**self.cookie_options)
return response
def delete(self, request):
@@ -117,10 +117,21 @@ class Session:
"""
@request.after_request
def _delete_session(request, response):
response.set_cookie('session', '', http_only=True,
expires='Thu, 01 Jan 1970 00:00:01 GMT')
response.delete_cookie('session', **self.cookie_options)
return response
def encode(self, payload, secret_key=None):
return jwt.encode(payload, secret_key or self.secret_key,
algorithm='HS256')
def decode(self, session, secret_key=None):
try:
payload = jwt.decode(session, secret_key or self.secret_key,
algorithms=['HS256'])
except jwt.exceptions.PyJWTError: # pragma: no cover
return {}
return payload
def with_session(f):
"""Decorator that passes the user session to the route handler.
@@ -134,15 +145,11 @@ def with_session(f):
return 'Hello, World!'
Note that the decorator does not save the session. To update the session,
call the :func:`update_session <microdot.session.update_session>` function.
call the :func:`session.save() <microdot.session.SessionDict.save>` method.
"""
@wraps(f)
async def wrapper(request, *args, **kwargs):
return await invoke_handler(
f, request, request.app._session.get(request), *args, **kwargs)
for attr in ['__name__', '__doc__', '__module__', '__qualname__']:
try:
setattr(wrapper, attr, getattr(f, attr))
except AttributeError: # pragma: no cover
pass
return wrapper
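A hedged sketch of the new ``cookie_options`` argument together with ``with_session``; the secret key and form field are illustrative::

    from microdot import Microdot, redirect
    from microdot.session import Session, with_session

    app = Microdot()
    Session(app, secret_key='top-secret!',
            cookie_options={'path': '/', 'http_only': True, 'secure': True})

    @app.post('/login')
    @with_session
    async def login(request, session):
        session['username'] = request.form.get('username')
        session.save()            # the decorator does not save automatically
        return redirect('/')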

View File

@@ -1,20 +1,44 @@
import asyncio
import json
from microdot.helpers import wraps
try:
import orjson as json
except ImportError:
import json
class SSE:
"""Server-Sent Events object.
An object of this class is sent to handler functions to manage the SSE
connection.
"""
def __init__(self):
self.event = asyncio.Event()
self.queue = []
async def send(self, data, event=None):
async def send(self, data, event=None, event_id=None):
"""Send an event to the client.
:param data: the data to send. It can be given as a string, bytes, dict
or list. Dictionaries and lists are serialized to JSON.
Any other types are converted to string before sending.
:param event: an optional event name, to send along with the data. If
given, it must be a string.
:param event_id: an optional event id, to send along with the data. If
given, it must be a string.
"""
if isinstance(data, (dict, list)):
data = json.dumps(data)
elif not isinstance(data, str):
data = str(data)
data = f'data: {data}\n\n'
if isinstance(data, str):
data = data.encode()
elif not isinstance(data, bytes):
data = str(data).encode()
data = b'data: ' + data + b'\n\n'
if event_id:
data = b'id: ' + event_id.encode() + b'\n' + data
if event:
data = f'event: {event}\n{data}'
data = b'event: ' + event.encode() + b'\n' + data
self.queue.append(data)
self.event.set()
@@ -30,24 +54,21 @@ def sse_response(request, event_function, *args, **kwargs):
:param args: additional positional arguments to be passed to the response.
:param kwargs: additional keyword arguments to be passed to the response.
Example::
@app.route('/events')
async def events_route(request):
async def events(request, sse):
# send an unnamed event with string data
await sse.send('hello')
# send an unnamed event with JSON data
await sse.send({'foo': 'bar'})
# send a named event
await sse.send('hello', event='greeting')
return sse_response(request, events)
This is a low-level function that can be used to implement a custom SSE
endpoint. In general the :func:`microdot.sse.with_sse` decorator should be
used instead.
"""
sse = SSE()
async def sse_task_wrapper():
await event_function(request, sse, *args, **kwargs)
try:
await event_function(request, sse, *args, **kwargs)
except asyncio.CancelledError: # pragma: no cover
pass
except Exception as exc:
# the SSE task raised an exception so we need to pass it to the
# main route so that it is re-raised there
sse.queue.append(exc)
sse.event.set()
task = asyncio.create_task(sse_task_wrapper())
@@ -65,7 +86,11 @@ def sse_response(request, event_function, *args, **kwargs):
except IndexError:
await sse.event.wait()
sse.event.clear()
if event is None:
if isinstance(event, Exception):
# if the event is an exception we re-raise it here so that it
# can be handled appropriately
raise event
elif event is None:
raise StopAsyncIteration
return event
@@ -85,10 +110,16 @@ def with_sse(f):
@app.route('/events')
@with_sse
async def events(request, sse):
for i in range(10):
await asyncio.sleep(1)
await sse.send(f'{i}')
# send an unnamed event with string data
await sse.send('hello')
# send an unnamed event with JSON data
await sse.send({'foo': 'bar'})
# send a named event
await sse.send('hello', event='greeting')
"""
@wraps(f)
async def sse_handler(request, *args, **kwargs):
return sse_response(request, f, *args, **kwargs)

View File

@@ -1,4 +1,4 @@
import json
import asyncio
from microdot.microdot import Request, Response, AsyncBytesIO
try:
@@ -6,6 +6,11 @@ try:
except: # pragma: no cover # noqa: E722
WebSocket = None
try:
import orjson as json
except ImportError:
import json
__all__ = ['TestClient', 'TestResponse']
@@ -19,7 +24,7 @@ class TestResponse:
#: explicitly sets it on the response object.
self.reason = None
#: A dictionary with the response headers.
self.headers = None
self.headers = {}
#: The body of the response, as a bytes object.
self.body = None
#: The body of the response, decoded to a UTF-8 string. Set to
@@ -28,6 +33,11 @@ class TestResponse:
#: The body of the JSON response, decoded to a dictionary or list. Set
#: to ``None`` if the response does not have a JSON payload.
self.json = None
#: The body of the SSE response, decoded to a list of events, each
#: given as a dictionary with ``data``, ``data_json``, ``event`` and
#: ``event_id`` keys. Set to ``None`` if the response does not have an
#: SSE payload.
self.events = None
def _initialize_response(self, res):
self.status_code = res.status_code
@@ -37,10 +47,13 @@ class TestResponse:
async def _initialize_body(self, res):
self.body = b''
iter = res.body_iter()
async for body in iter: # pragma: no branch
if isinstance(body, str):
body = body.encode()
self.body += body
try:
async for body in iter: # pragma: no branch
if isinstance(body, str):
body = body.encode()
self.body += body
except asyncio.CancelledError: # pragma: no cover
pass
if hasattr(iter, 'aclose'): # pragma: no branch
await iter.aclose()
@@ -56,6 +69,32 @@ class TestResponse:
if content_type.split(';')[0] == 'application/json':
self.json = json.loads(self.text)
def _process_sse_body(self):
if 'Content-Type' in self.headers: # pragma: no branch
content_type = self.headers['Content-Type']
if content_type.split(';')[0] == 'text/event-stream':
self.events = []
for sse_event in self.body.split(b'\n\n'):
data = None
event = None
event_id = None
for line in sse_event.split(b'\n'):
if line.startswith(b'data:'):
data = line[5:].strip()
elif line.startswith(b'event:'):
event = line[6:].strip().decode()
elif line.startswith(b'id:'):
event_id = line[3:].strip().decode()
if data:
data_json = None
try:
data_json = json.loads(data)
except ValueError:
pass
self.events.append({
"data": data, "data_json": data_json,
"event": event, "event_id": event_id})
@classmethod
async def create(cls, res):
test_res = cls()
@@ -64,6 +103,7 @@ class TestResponse:
await test_res._initialize_body(res)
test_res._process_text_body()
test_res._process_json_body()
test_res._process_sse_body()
return test_res
@@ -77,7 +117,7 @@ class TestClient:
The following example shows how to create a test client for an application
and send a test request::
from microdot_asyncio import Microdot
from microdot import Microdot
app = Microdot()
@@ -101,10 +141,10 @@ class TestClient:
if body is None:
body = b''
elif isinstance(body, (dict, list)):
body = json.dumps(body).encode()
body = json.dumps(body)
if 'Content-Type' not in headers: # pragma: no cover
headers['Content-Type'] = 'application/json'
elif isinstance(body, str):
if isinstance(body, str):
body = body.encode()
if body and 'Content-Length' not in headers:
headers['Content-Length'] = str(len(body))
@@ -112,9 +152,13 @@ class TestClient:
headers['Host'] = 'example.com:1234'
return body, headers
def _process_cookies(self, headers):
def _process_cookies(self, path, headers):
cookies = ''
for name, value in self.cookies.items():
if isinstance(value, tuple):
value, cookie_path = value
if not path.startswith(cookie_path):
continue
if cookies:
cookies += '; '
cookies += name + '=' + value
@@ -123,7 +167,7 @@ class TestClient:
headers['Cookie'] += '; ' + cookies
else:
headers['Cookie'] = cookies
return cookies, headers
return headers
def _render_request(self, method, path, headers, body):
request_bytes = '{method} {path} HTTP/1.0\n'.format(
@@ -139,26 +183,45 @@ class TestClient:
for cookie in cookies:
cookie_name, cookie_value = cookie.split('=', 1)
cookie_options = cookie_value.split(';')
path = '/'
delete = False
for option in cookie_options[1:]:
if option.strip().lower().startswith('expires='):
_, e = option.strip().split('=', 1)
option = option.strip().lower()
if option.startswith(
'max-age='): # pragma: no cover
_, age = option.split('=', 1)
try:
age = int(age)
except ValueError: # pragma: no cover
age = 0
if age <= 0:
delete = True
elif option.startswith('expires='):
_, e = option.split('=', 1)
# this is a very limited parser for cookie expiry
# that only detects a cookie deletion request when
# the date is 1/1/1970
if '1 jan 1970' in e.lower(): # pragma: no branch
delete = True
break
elif option.startswith('path='):
_, path = option.split('=', 1)
if delete:
if cookie_name in self.cookies: # pragma: no branch
del self.cookies[cookie_name]
cookie_path = self.cookies[cookie_name][1] \
if isinstance(self.cookies[cookie_name], tuple) \
else '/'
if path == cookie_path:
del self.cookies[cookie_name]
else:
self.cookies[cookie_name] = cookie_options[0]
if path == '/':
self.cookies[cookie_name] = cookie_options[0]
else:
self.cookies[cookie_name] = (cookie_options[0], path)
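A hedged sketch of what the cookie store looks like after the rules above (names and values are illustrative)::

    client.cookies == {
        'session': 'abc123',        # default path '/' is stored as a plain value
        'foo': ('bar', '/child'),   # any other path is stored as a (value, path) tuple
    }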
async def request(self, method, path, headers=None, body=None, sock=None):
headers = headers or {}
body, headers = self._process_body(body, headers)
cookies, headers = self._process_cookies(headers)
headers = self._process_cookies(path, headers)
request_bytes = self._render_request(method, path, headers, body)
if sock:
reader = sock[0]
@@ -172,7 +235,7 @@ class TestClient:
('127.0.0.1', 1234))
res = await self.app.dispatch_request(req)
if res == Response.already_handled:
return None
return TestResponse()
res.complete()
self._update_cookies(res)
@@ -292,6 +355,8 @@ class TestClient:
async def awrite(self, data):
if self.started:
h = WebSocket._parse_frame_header(data[0:2])
if h[1] not in [WebSocket.TEXT, WebSocket.BINARY]:
return
if h[3] < 0:
data = data[2 - h[3]:]
else:

View File

@@ -1,10 +1,21 @@
import binascii
import hashlib
from microdot import Response
from microdot.microdot import MUTED_SOCKET_ERRORS
from microdot import Request, Response
from microdot.microdot import MUTED_SOCKET_ERRORS, print_exception
from microdot.helpers import wraps
class WebSocketError(Exception):
"""Exception raised when an error occurs in a WebSocket connection."""
pass
class WebSocket:
"""A WebSocket connection object.
An instance of this class is sent to handler functions to manage the
WebSocket connection.
"""
CONT = 0
TEXT = 1
BINARY = 2
@@ -12,6 +23,18 @@ class WebSocket:
PING = 9
PONG = 10
#: Specify the maximum message size that can be received when calling the
#: ``receive()`` method. Messages with payloads that are larger than this
#: size will be rejected and the connection closed. Set to 0 to disable
#: the size check (be aware of potential security issues if you do this),
#: or to -1 to use the value set in
#: ``Request.max_body_length``. The default is -1.
#:
#: Example::
#:
#: WebSocket.max_message_length = 4 * 1024 # up to 4KB messages
max_message_length = -1
def __init__(self, request):
self.request = request
self.closed = False
@@ -26,6 +49,7 @@ class WebSocket:
b'Sec-WebSocket-Accept: ' + response + b'\r\n\r\n')
async def receive(self):
"""Receive a message from the client."""
while True:
opcode, payload = await self._read_frame()
send_opcode, data = self._process_websocket_frame(opcode, payload)
@@ -35,12 +59,20 @@ class WebSocket:
return data
async def send(self, data, opcode=None):
"""Send a message to the client.
:param data: the data to send, given as a string or bytes.
:param opcode: a custom frame opcode to use. If not given, the opcode
is ``TEXT`` or ``BINARY`` depending on the type of the
data.
"""
frame = self._encode_websocket_frame(
opcode or (self.TEXT if isinstance(data, str) else self.BINARY),
data)
await self.request.sock[1].awrite(frame)
async def close(self):
"""Close the websocket connection."""
if not self.closed: # pragma: no cover
self.closed = True
await self.send(b'', self.CLOSE)
@@ -72,7 +104,7 @@ class WebSocket:
fin = header[0] & 0x80
opcode = header[0] & 0x0f
if fin == 0 or opcode == cls.CONT: # pragma: no cover
raise OSError(32, 'Continuation frames not supported')
raise WebSocketError('Continuation frames not supported')
has_mask = header[1] & 0x80
length = header[1] & 0x7f
if length == 126:
@@ -87,7 +119,7 @@ class WebSocket:
elif opcode == self.BINARY:
pass
elif opcode == self.CLOSE:
raise OSError(32, 'Websocket connection closed')
raise WebSocketError('Websocket connection closed')
elif opcode == self.PING:
return self.PONG, payload
elif opcode == self.PONG: # pragma: no branch
@@ -114,17 +146,21 @@ class WebSocket:
async def _read_frame(self):
header = await self.request.sock[0].read(2)
if len(header) != 2: # pragma: no cover
raise OSError(32, 'Websocket connection closed')
raise WebSocketError('Websocket connection closed')
fin, opcode, has_mask, length = self._parse_frame_header(header)
if length == -2:
length = await self.request.sock[0].read(2)
length = await self.request.sock[0].readexactly(2)
length = int.from_bytes(length, 'big')
elif length == -8:
length = await self.request.sock[0].read(8)
length = await self.request.sock[0].readexactly(8)
length = int.from_bytes(length, 'big')
max_allowed_length = Request.max_body_length \
if self.max_message_length == -1 else self.max_message_length
if length > max_allowed_length:
raise WebSocketError('Message too large')
if has_mask: # pragma: no cover
mask = await self.request.sock[0].read(4)
payload = await self.request.sock[0].read(length)
mask = await self.request.sock[0].readexactly(4)
payload = await self.request.sock[0].readexactly(length)
if has_mask: # pragma: no cover
payload = bytes(x ^ mask[i % 4] for i, x in enumerate(payload))
return opcode, payload
@@ -157,15 +193,24 @@ async def websocket_upgrade(request):
def websocket_wrapper(f, upgrade_function):
@wraps(f)
async def wrapper(request, *args, **kwargs):
ws = await upgrade_function(request)
try:
await f(request, ws, *args, **kwargs)
await ws.close() # pragma: no cover
except OSError as exc:
if exc.errno not in MUTED_SOCKET_ERRORS: # pragma: no cover
raise
return ''
except WebSocketError:
pass
except Exception as exc:
print_exception(exc)
finally: # pragma: no cover
try:
await ws.close()
except Exception:
pass
return Response.already_handled
return wrapper
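An illustrative sketch of capping incoming message sizes with the new ``max_message_length`` attribute (the echo route mirrors the tests; the 4KB limit is an arbitrary example)::

    from microdot import Microdot
    from microdot.websocket import WebSocket, with_websocket

    WebSocket.max_message_length = 4 * 1024  # larger frames raise WebSocketError

    app = Microdot()

    @app.route('/echo')
    @with_websocket
    async def echo(request, ws):
        while True:
            data = await ws.receive()
            await ws.send(data)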

View File

@@ -9,6 +9,12 @@ from microdot.websocket import WebSocket, websocket_upgrade, \
class Microdot(BaseMicrodot):
"""A subclass of the core :class:`Microdot <microdot.Microdot>` class that
implements the WSGI protocol.
This class must be used as the application instance when running under a
WSGI web server.
"""
def __init__(self):
super().__init__()
self.loop = asyncio.new_event_loop()
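A minimal deployment sketch for the WSGI variant (the ``app.py`` module name and the gunicorn command are assumptions)::

    # app.py
    from microdot.wsgi import Microdot

    app = Microdot()

    @app.route('/')
    def index(request):
        return 'Hello from WSGI!'

    # run under any WSGI server, e.g.:  gunicorn app:app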

View File

@@ -4,8 +4,11 @@ from tests.test_request import * # noqa: F401, F403
from tests.test_response import * # noqa: F401, F403
from tests.test_urlencode import * # noqa: F401, F403
from tests.test_url_pattern import * # noqa: F401, F403
from tests.test_multipart import * # noqa: F401, F403
from tests.test_websocket import * # noqa: F401, F403
from tests.test_sse import * # noqa: F401, F403
from tests.test_cors import * # noqa: F401, F403
from tests.test_utemplate import * # noqa: F401, F403
from tests.test_session import * # noqa: F401, F403
from tests.test_auth import * # noqa: F401, F403
from tests.test_login import * # noqa: F401, F403

tests/files/test.txt.gz Normal file
View File

@@ -0,0 +1 @@
foo

tests/test_auth.py Normal file
View File

@@ -0,0 +1,190 @@
import asyncio
import binascii
import unittest
from microdot import Microdot
from microdot.auth import BasicAuth, TokenAuth
from microdot.test_client import TestClient
class TestAuth(unittest.TestCase):
@classmethod
def setUpClass(cls):
if hasattr(asyncio, 'set_event_loop'):
asyncio.set_event_loop(asyncio.new_event_loop())
cls.loop = asyncio.get_event_loop()
def _run(self, coro):
return self.loop.run_until_complete(coro)
def test_basic_auth(self):
app = Microdot()
basic_auth = BasicAuth()
@basic_auth.authenticate
def authenticate(request, username, password):
if username == 'foo' and password == 'bar':
return {'username': username}
@app.route('/')
@basic_auth
def index(request):
return request.g.current_user['username']
client = TestClient(app)
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 401)
res = self._run(client.get('/', headers={
'Authorization': 'Basic ' + binascii.b2a_base64(
b'foo:bar').decode()}))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'foo')
res = self._run(client.get('/', headers={
'Authorization': 'Basic ' + binascii.b2a_base64(
b'foo:baz').decode()}))
self.assertEqual(res.status_code, 401)
def test_basic_optional_auth(self):
app = Microdot()
basic_auth = BasicAuth()
@basic_auth.authenticate
def authenticate(request, username, password):
if username == 'foo' and password == 'bar':
return {'username': username}
@app.route('/')
@basic_auth.optional
def index(request):
return request.g.current_user['username'] \
if request.g.current_user else ''
client = TestClient(app)
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, '')
res = self._run(client.get('/', headers={
'Authorization': 'Basic ' + binascii.b2a_base64(
b'foo:bar').decode()}))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'foo')
res = self._run(client.get('/', headers={
'Authorization': 'Basic ' + binascii.b2a_base64(
b'foo:baz').decode()}))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, '')
def test_token_auth(self):
app = Microdot()
token_auth = TokenAuth()
@token_auth.authenticate
def authenticate(request, token):
if token == 'foo':
return 'user'
@app.route('/')
@token_auth
def index(request):
return request.g.current_user
client = TestClient(app)
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 401)
res = self._run(client.get('/', headers={
'Authorization': 'Basic foo'}))
self.assertEqual(res.status_code, 401)
res = self._run(client.get('/', headers={'Authorization': 'invalid'}))
self.assertEqual(res.status_code, 401)
res = self._run(client.get('/', headers={
'Authorization': 'Bearer foo'}))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'user')
def test_token_optional_auth(self):
app = Microdot()
token_auth = TokenAuth()
@token_auth.authenticate
def authenticate(request, token):
if token == 'foo':
return 'user'
@app.route('/')
@token_auth.optional
def index(request):
return request.g.current_user or ''
client = TestClient(app)
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, '')
res = self._run(client.get('/', headers={
'Authorization': 'Basic foo'}))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, '')
res = self._run(client.get('/', headers={'Authorization': 'foo'}))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, '')
res = self._run(client.get('/', headers={
'Authorization': 'Bearer foo'}))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'user')
def test_token_auth_custom_header(self):
app = Microdot()
token_auth = TokenAuth(header='X-Auth-Token')
@token_auth.authenticate
def authenticate(request, token):
if token == 'foo':
return 'user'
@app.route('/')
@token_auth
def index(request):
return request.g.current_user
client = TestClient(app)
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 401)
res = self._run(client.get('/', headers={
'Authorization': 'Basic foo'}))
self.assertEqual(res.status_code, 401)
res = self._run(client.get('/', headers={'Authorization': 'foo'}))
self.assertEqual(res.status_code, 401)
res = self._run(client.get('/', headers={
'Authorization': 'Bearer foo'}))
self.assertEqual(res.status_code, 401)
res = self._run(client.get('/', headers={
'X-Token-Auth': 'Bearer foo'}))
self.assertEqual(res.status_code, 401)
res = self._run(client.get('/', headers={'X-Auth-Token': 'foo'}))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'user')
res = self._run(client.get('/', headers={'x-auth-token': 'foo'}))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'user')
@token_auth.errorhandler
def error_handler(request):
return {'status_code': 403}, 403
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 403)
self.assertEqual(res.json, {'status_code': 403})

tests/test_login.py Normal file
View File

@@ -0,0 +1,188 @@
import asyncio
import unittest
from microdot import Microdot
from microdot.login import Login
from microdot.session import Session
from microdot.test_client import TestClient
class TestLogin(unittest.TestCase):
@classmethod
def setUpClass(cls):
if hasattr(asyncio, 'set_event_loop'):
asyncio.set_event_loop(asyncio.new_event_loop())
cls.loop = asyncio.get_event_loop()
def _run(self, coro):
return self.loop.run_until_complete(coro)
def test_login(self):
app = Microdot()
Session(app, secret_key='secret')
login = Login()
class User:
def __init__(self, id, name):
self.id = id
self.name = name
@login.user_loader
def load_user(user_id):
return User(user_id, f'user{user_id}')
@app.get('/')
@login
def index(request):
return request.g.current_user.name
@app.post('/login')
async def login_route(request):
return await login.login_user(request, User(123, 'user123'))
@app.post('/logout')
async def logout_route(request):
await login.logout_user(request)
return 'ok'
client = TestClient(app)
res = self._run(client.get('/?foo=bar'))
self.assertEqual(res.status_code, 302)
self.assertEqual(res.headers['Location'], '/login?next=/%3Ffoo%3Dbar')
res = self._run(client.post('/login?next=/%3Ffoo=bar'))
self.assertEqual(res.status_code, 302)
self.assertEqual(res.headers['Location'], '/?foo=bar')
self.assertEqual(len(res.headers['Set-Cookie']), 1)
self.assertIn('session', client.cookies)
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'user123')
res = self._run(client.post('/logout'))
self.assertEqual(res.status_code, 200)
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 302)
def test_login_bad_user_id(self):
class User:
def __init__(self, id, name):
self.id = id
self.name = name
app = Microdot()
Session(app, secret_key='secret')
login = Login()
@login.user_loader
def load_user(user_id):
return None
@app.get('/foo')
@login
async def index(request):
return 'ok'
@app.post('/login')
async def login_route(request):
return await login.login_user(request, User(1, 'user'))
client = TestClient(app)
res = self._run(client.post('/login?next=/'))
self.assertEqual(res.status_code, 302)
self.assertEqual(res.headers['Location'], '/')
res = self._run(client.get('/foo'))
self.assertEqual(res.status_code, 302)
self.assertEqual(res.headers['Location'], '/login?next=/foo')
def test_login_bad_redirect(self):
class User:
def __init__(self, id, name):
self.id = id
self.name = name
app = Microdot()
Session(app, secret_key='secret')
login = Login()
@login.user_loader
def load_user(user_id):
return user_id
@app.get('/')
@login
async def index(request):
return 'ok'
@app.post('/login')
async def login_route(request):
return await login.login_user(request, User(1, 'user'))
client = TestClient(app)
res = self._run(client.post('/login?next=http://example.com'))
self.assertEqual(res.status_code, 302)
self.assertEqual(res.headers['Location'], '/')
def test_login_remember(self):
class User:
def __init__(self, id, name):
self.id = id
self.name = name
app = Microdot()
Session(app, secret_key='secret')
login = Login()
@login.user_loader
def load_user(user_id):
return User(user_id, f'user{user_id}')
@app.get('/')
@login
def index(request):
return {'user': request.g.current_user.id}
@app.post('/login')
async def login_route(request):
return await login.login_user(request, User(1, 'user1'),
remember=True)
@app.post('/logout')
async def logout(request):
await login.logout_user(request)
return 'ok'
@app.get('/fresh')
@login.fresh
async def fresh(request):
return f'fresh {request.g.current_user.id}'
client = TestClient(app)
res = self._run(client.post('/login?next=/%3Ffoo=bar'))
self.assertEqual(res.status_code, 302)
self.assertEqual(res.headers['Location'], '/?foo=bar')
self.assertEqual(len(res.headers['Set-Cookie']), 2)
self.assertIn('session', client.cookies)
self.assertIn('_remember', client.cookies)
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, '{"user": 1}')
res = self._run(client.get('/fresh'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'fresh 1')
del client.cookies['session']
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 200)
res = self._run(client.get('/fresh'))
self.assertEqual(res.status_code, 302)
self.assertEqual(res.headers['Location'], '/login?next=/fresh')
res = self._run(client.post('/logout'))
self.assertEqual(res.status_code, 200)
self.assertFalse('_remember' in client.cookies)
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 302)

View File

@@ -25,6 +25,14 @@ class TestMicrodot(unittest.TestCase):
async def index2(req):
return 'foo-async'
@app.route('/arg/<id>')
def index3(req, id):
return id
@app.route('/arg/async/<id>')
async def index4(req, id):
return f'async-{id}'
client = TestClient(app)
res = self._run(client.get('/'))
@@ -45,6 +53,24 @@ class TestMicrodot(unittest.TestCase):
self.assertEqual(res.body, b'foo-async')
self.assertEqual(res.json, None)
res = self._run(client.get('/arg/123'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.headers['Content-Type'],
'text/plain; charset=UTF-8')
self.assertEqual(res.headers['Content-Length'], '3')
self.assertEqual(res.text, '123')
self.assertEqual(res.body, b'123')
self.assertEqual(res.json, None)
res = self._run(client.get('/arg/async/123'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.headers['Content-Type'],
'text/plain; charset=UTF-8')
self.assertEqual(res.headers['Content-Length'], '9')
self.assertEqual(res.text, 'async-123')
self.assertEqual(res.body, b'async-123')
self.assertEqual(res.json, None)
def test_post_request(self):
app = Microdot()
@@ -177,6 +203,7 @@ class TestMicrodot(unittest.TestCase):
req.cookies['one'] + req.cookies['two'] + req.cookies['three'])
res.set_cookie('four', '4')
res.delete_cookie('two', path='/')
res.delete_cookie('one', path='/bad')
return res
client = TestClient(app, cookies={'one': '1', 'two': '2'})
@@ -247,6 +274,14 @@ class TestMicrodot(unittest.TestCase):
return '<p>four</p>', 202, \
{'Content-Type': 'text/html; charset=UTF-8'}
@app.route('/status')
def five(req):
return 202
@app.route('/status-headers')
def six(req):
return 202, {'Content-Type': 'text/html; charset=UTF-8'}
client = TestClient(app)
res = self._run(client.get('/body'))
@@ -272,6 +307,18 @@ class TestMicrodot(unittest.TestCase):
'text/html; charset=UTF-8')
self.assertEqual(res.text, '<p>four</p>')
res = self._run(client.get('/status'))
self.assertEqual(res.text, '')
self.assertEqual(res.status_code, 202)
self.assertEqual(res.headers['Content-Type'],
'text/plain; charset=UTF-8')
res = self._run(client.get('/status-headers'))
self.assertEqual(res.text, '')
self.assertEqual(res.status_code, 202)
self.assertEqual(res.headers['Content-Type'],
'text/html; charset=UTF-8')
def test_before_after_request(self):
app = Microdot()
@@ -724,7 +771,7 @@ class TestMicrodot(unittest.TestCase):
client = TestClient(app)
res = self._run(client.get('/'))
self.assertEqual(res, None)
self.assertEqual(res.body, None)
def test_mount(self):
subapp = Microdot()
@@ -747,7 +794,7 @@ class TestMicrodot(unittest.TestCase):
@subapp.route('/app')
def index(req):
return req.g.before + ':foo'
return req.g.before + ':' + req.url_prefix
app = Microdot()
app.mount(subapp, url_prefix='/sub')
@@ -764,4 +811,203 @@ class TestMicrodot(unittest.TestCase):
self.assertEqual(res.status_code, 200)
self.assertEqual(res.headers['Content-Type'],
'text/plain; charset=UTF-8')
self.assertEqual(res.text, 'before:foo:after')
self.assertEqual(res.text, 'before:/sub:after')
def test_mount_local(self):
subapp1 = Microdot()
subapp2 = Microdot()
@subapp1.before_request
def before1(req):
req.g.before += ':before1'
@subapp1.after_error_request
def after_error1(req, res):
res.body += b':errorafter'
@subapp1.errorhandler(ValueError)
def value_error(req, exc):
return str(exc), 400
@subapp1.route('/')
def index1(req):
raise ZeroDivisionError()
@subapp1.route('/foo')
def foo(req):
return req.g.before + ':foo:' + req.url_prefix
@subapp1.route('/err')
def err(req):
raise ValueError('err')
@subapp1.route('/err2')
def err2(req):
class MyErr(ValueError):
pass
raise MyErr('err')
@subapp2.before_request
def before2(req):
req.g.before += ':before2'
@subapp2.after_request
def after2(req, res):
res.body += b':after'
@subapp2.errorhandler(405)
def method_not_found2(req):
return '405', 405
@subapp2.route('/bar')
def bar(req):
return req.g.before + ':bar:' + req.url_prefix
@subapp2.route('/baz')
def baz(req):
abort(405)
app = Microdot()
@app.before_request
def before(req):
req.g.before = 'before-app'
@app.after_request
def after(req, res):
res.body += b':after-app'
app.mount(subapp1, local=True)
app.mount(subapp2, url_prefix='/sub', local=True)
client = TestClient(app)
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 500)
self.assertEqual(res.text, 'Internal server error:errorafter')
res = self._run(client.get('/foo'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.headers['Content-Type'],
'text/plain; charset=UTF-8')
self.assertEqual(res.text, 'before-app:before1:foo::after-app')
res = self._run(client.get('/err'))
self.assertEqual(res.status_code, 400)
self.assertEqual(res.text, 'err:errorafter')
res = self._run(client.get('/err2'))
self.assertEqual(res.status_code, 400)
self.assertEqual(res.text, 'err:errorafter')
res = self._run(client.get('/sub/bar'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.headers['Content-Type'],
'text/plain; charset=UTF-8')
self.assertEqual(res.text,
'before-app:before2:bar:/sub:after:after-app')
res = self._run(client.post('/sub/bar'))
self.assertEqual(res.status_code, 405)
self.assertEqual(res.text, '405')
res = self._run(client.get('/sub/baz'))
self.assertEqual(res.status_code, 405)
self.assertEqual(res.text, '405')
def test_many_mounts(self):
subsubapp = Microdot()
@subsubapp.before_request
def subsubapp_before(req):
req.g.before = 'subsubapp'
@subsubapp.route('/')
def subsubapp_index(req):
return f'{req.g.before}:{req.subapp == subsubapp}:{req.url_prefix}'
subapp = Microdot()
@subapp.before_request
def subapp_before(req):
req.g.before = 'subapp'
@subapp.route('/')
def subapp_index(req):
return f'{req.g.before}:{req.subapp == subapp}:{req.url_prefix}'
app = Microdot()
@app.before_request
def app_before(req):
req.g.before = 'app'
@app.route('/')
def app_index(req):
return f'{req.g.before}:{req.subapp is None}:{req.url_prefix}'
subapp.mount(subsubapp, url_prefix='/subsub')
app.mount(subapp, url_prefix='/sub')
client = TestClient(app)
res = self._run(client.get('/sub/subsub/'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'subsubapp:True:/sub/subsub')
res = self._run(client.get('/sub/'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'subsubapp:True:/sub')
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'subsubapp:True:')
def test_many_local_mounts(self):
subsubapp = Microdot()
@subsubapp.before_request
def subsubapp_before(req):
req.g.before = 'subsubapp'
@subsubapp.route('/')
def subsubapp_index(req):
return f'{req.g.before}:{req.subapp == subsubapp}:{req.url_prefix}'
subapp = Microdot()
@subapp.before_request
def subapp_before(req):
req.g.before = 'subapp'
@subapp.route('/')
def subapp_index(req):
return f'{req.g.before}:{req.subapp == subapp}:{req.url_prefix}'
app = Microdot()
@app.before_request
def app_before(req):
req.g.before = 'app'
@app.route('/')
def app_index(req):
return f'{req.g.before}:{req.subapp is None}:{req.url_prefix}'
subapp.mount(subsubapp, url_prefix='/subsub', local=True)
app.mount(subapp, url_prefix='/sub', local=True)
client = TestClient(app)
res = self._run(client.get('/sub/subsub/'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'subsubapp:True:/sub/subsub')
res = self._run(client.get('/sub/'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'subapp:True:/sub')
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'app:True:')

tests/test_multipart.py Normal file
View File

@@ -0,0 +1,192 @@
import asyncio
import os
import unittest
from microdot import Microdot
from microdot.multipart import with_form_data, FileUpload, FormDataIter
from microdot.test_client import TestClient
class TestMultipart(unittest.TestCase):
@classmethod
def setUpClass(cls):
if hasattr(asyncio, 'set_event_loop'):
asyncio.set_event_loop(asyncio.new_event_loop())
cls.loop = asyncio.get_event_loop()
def _run(self, coro):
return self.loop.run_until_complete(coro)
def test_simple_form(self):
app = Microdot()
@app.post('/sync')
@with_form_data
def sync_route(req):
return dict(req.form)
@app.post('/async')
@with_form_data
async def async_route(req):
return dict(req.form)
client = TestClient(app)
res = self._run(client.post(
'/sync', headers={
'Content-Type': 'multipart/form-data; boundary=boundary',
},
body=(
b'--boundary\r\n'
b'Content-Disposition: form-data; name="foo"\r\n\r\nbar\r\n'
b'--boundary\r\n'
b'Content-Disposition: form-data; name="baz"\r\n\r\nbaz\r\n'
b'--boundary--\r\n')
))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.json, {'foo': 'bar', 'baz': 'baz'})
res = self._run(client.post(
'/async', headers={
'Content-Type': 'multipart/form-data; boundary=boundary',
},
body=(
b'--boundary\r\n'
b'Content-Disposition: form-data; name="foo"\r\n\r\nbar\r\n'
b'--boundary\r\n'
b'Content-Disposition: form-data; name="baz"\r\n\r\nbaz\r\n'
b'--boundary--\r\n')
))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.json, {'foo': 'bar', 'baz': 'baz'})
def test_form_with_files(self):
saved_max_memory_size = FileUpload.max_memory_size
FileUpload.max_memory_size = 5
app = Microdot()
@app.post('/async')
@with_form_data
async def async_route(req):
d = dict(req.form)
for name, file in req.files.items():
d[name] = '{}|{}|{}'.format(file.filename, file.content_type,
(await file.read()).decode())
return d
client = TestClient(app)
res = self._run(client.post(
'/async', headers={
'Content-Type': 'multipart/form-data; boundary=boundary',
},
body=(
b'--boundary\r\n'
b'Content-Disposition: form-data; name="foo"\r\n\r\nbar\r\n'
b'--boundary\r\n'
b'Content-Disposition: form-data; name="f"; filename="f"\r\n'
b'Content-Type: text/plain\r\n\r\nbaz\r\n'
b'--boundary\r\n'
b'Content-Disposition: form-data; name="g"; filename="g"\r\n'
b'Content-Type: text/html\r\n\r\n<p>hello</p>\r\n'
b'--boundary\r\n'
b'Content-Disposition: form-data; name="x"\r\n\r\ny\r\n'
b'--boundary--\r\n')
))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.json, {'foo': 'bar', 'x': 'y',
'f': 'f|text/plain|baz',
'g': 'g|text/html|<p>hello</p>'})
FileUpload.max_memory_size = saved_max_memory_size
def test_file_save(self):
app = Microdot()
@app.post('/async')
@with_form_data
async def async_route(req):
for _, file in req.files.items():
await file.save('_x.txt')
client = TestClient(app)
res = self._run(client.post(
'/async', headers={
'Content-Type': 'multipart/form-data; boundary=boundary',
},
body=(
b'--boundary\r\n'
b'Content-Disposition: form-data; name="foo"\r\n\r\nbar\r\n'
b'--boundary\r\n'
b'Content-Disposition: form-data; name="f"; filename="f"\r\n'
b'Content-Type: text/plain\r\n\r\nbaz\r\n'
b'--boundary--\r\n')
))
self.assertEqual(res.status_code, 204)
with open('_x.txt', 'rb') as f:
self.assertEqual(f.read(), b'baz')
os.unlink('_x.txt')
def test_no_form(self):
app = Microdot()
@app.post('/async')
@with_form_data
async def async_route(req):
return str(req.form)
client = TestClient(app)
res = self._run(client.post('/async', body={'foo': 'bar'}))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.text, 'None')
def test_upload_iterator(self):
app = Microdot()
@app.post('/async')
async def async_route(req):
d = {}
async for name, value in FormDataIter(req):
if isinstance(value, FileUpload):
d[name] = '{}|{}|{}'.format(value.filename,
value.content_type,
(await value.read(4)).decode())
else:
d[name] = value
return d
client = TestClient(app)
res = self._run(client.post(
'/async', headers={
'Content-Type': 'multipart/form-data; boundary=boundary',
},
body=(
b'--boundary\r\n'
b'Content-Disposition: form-data; name="foo"\r\n\r\nbar\r\n'
b'--boundary\r\n'
b'Content-Disposition: form-data; name="f"; filename="f"\r\n'
b'Content-Type: text/plain\r\n\r\nbaz\r\n'
b'--boundary\r\n'
b'Content-Disposition: form-data; name="g"; filename="g.h"\r\n'
b'Content-Type: text/html\r\n\r\n<p>hello</p>\r\n'
b'--boundary\r\n'
b'Content-Disposition: form-data; name="x"\r\n\r\ny\r\n'
b'--boundary\r\n'
b'Content-Disposition: form-data; name="h"; filename="hh"\r\n'
b'Content-Type: text/plain\r\n\r\nyy' + (b'z' * 500) + b'\r\n'
b'--boundary\r\n'
b'Content-Disposition: form-data; name="i"; filename="i.1"\r\n'
b'Content-Type: text/plain\r\n\r\n1234\r\n'
b'--boundary--\r\n')
))
self.assertEqual(res.status_code, 200)
self.assertEqual(res.json, {
'foo': 'bar',
'f': 'f|text/plain|baz',
'g': 'g.h|text/html|<p>h',
'x': 'y',
'h': 'hh|text/plain|yyzz',
'i': 'i.1|text/plain|1234',
})

View File

@@ -1,5 +1,4 @@
import asyncio
from datetime import datetime
import unittest
from microdot import Response
from tests.mock_socket import FakeStreamAsync
@@ -137,10 +136,10 @@ class TestResponse(unittest.TestCase):
self.assertTrue(fd.response.endswith(b'\r\n\r\nfoobar'))
def test_create_from_other(self):
res = Response(123)
res = Response(23.7)
self.assertEqual(res.status_code, 200)
self.assertEqual(res.headers, {})
self.assertEqual(res.body, 123)
self.assertEqual(res.body, 23.7)
def test_create_with_status_code(self):
res = Response('not found', 404)
@@ -186,14 +185,15 @@ class TestResponse(unittest.TestCase):
res.set_cookie('foo2', 'bar2', path='/', partitioned=True)
res.set_cookie('foo3', 'bar3', domain='example.com:1234')
res.set_cookie('foo4', 'bar4',
expires=datetime(2019, 11, 5, 2, 23, 54))
expires='Tue, 05 Nov 2019 02:23:54 GMT')
res.set_cookie('foo5', 'bar5', max_age=123,
expires='Thu, 01 Jan 1970 00:00:00 GMT')
res.set_cookie('foo6', 'bar6', secure=True, http_only=True)
res.set_cookie('foo7', 'bar7', path='/foo', domain='example.com:1234',
expires=datetime(2019, 11, 5, 2, 23, 54), max_age=123,
expires='Tue, 05 Nov 2019 02:23:54 GMT', max_age=123,
secure=True, http_only=True)
res.delete_cookie('foo8', http_only=True)
res.delete_cookie('foo9', path='/s')
self.assertEqual(res.headers, {'Set-Cookie': [
'foo1=bar1',
'foo2=bar2; Path=/; Partitioned',
@@ -204,7 +204,10 @@ class TestResponse(unittest.TestCase):
'foo7=bar7; Path=/foo; Domain=example.com:1234; '
'Expires=Tue, 05 Nov 2019 02:23:54 GMT; Max-Age=123; Secure; '
'HttpOnly',
'foo8=; Expires=Thu, 01 Jan 1970 00:00:01 GMT; HttpOnly',
('foo8=; Expires=Thu, 01 Jan 1970 00:00:01 GMT; Max-Age=0; '
'HttpOnly'),
('foo9=; Path=/s; Expires=Thu, 01 Jan 1970 00:00:01 GMT; '
'Max-Age=0'),
]})
def test_redirect(self):
@@ -277,6 +280,17 @@ class TestResponse(unittest.TestCase):
'application/octet-stream')
self.assertEqual(res.headers['Content-Encoding'], 'gzip')
def test_send_file_gzip_handling(self):
res = Response.send_file('tests/files/test.txt.gz')
self.assertEqual(res.status_code, 200)
self.assertEqual(res.headers['Content-Type'],
'application/octet-stream')
res = Response.send_file('tests/files/test.txt.gz', compressed=True)
self.assertEqual(res.status_code, 200)
self.assertEqual(res.headers['Content-Type'], 'text/plain')
self.assertEqual(res.headers['Content-Encoding'], 'gzip')
def test_default_content_type(self):
original_content_type = Response.default_content_type
res = Response('foo')

View File

@@ -37,7 +37,7 @@ class TestSession(unittest.TestCase):
@app.post('/set')
@with_session
async def save_session(req, session):
def save_session(req, session):
session['name'] = 'joe'
session.save()
return 'OK'
@@ -82,3 +82,77 @@ class TestSession(unittest.TestCase):
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 200)
def test_session_default_path(self):
app = Microdot()
Session(app, secret_key='some-other-secret')
client = TestClient(app)
@app.get('/')
@with_session
def index(req, session):
session['foo'] = 'bar'
session.save()
return ''
@app.get('/child')
@with_session
def child(req, session):
return str(session.get('foo'))
@app.get('/delete')
@with_session
def delete(req, session):
session.delete()
return ''
res = self._run(client.get('/'))
self.assertEqual(res.status_code, 200)
res = self._run(client.get('/child'))
self.assertEqual(res.text, 'bar')
res = self._run(client.get('/delete'))
res = self._run(client.get('/child'))
self.assertEqual(res.text, 'None')
def test_session_custom_path(self):
app = Microdot()
session_ext = Session()
session_ext.initialize(app, secret_key='some-other-secret',
cookie_options={'path': '/child',
'http_only': False})
client = TestClient(app)
@app.get('/')
@with_session
def index(req, session):
return str(session.get('foo'))
@app.get('/child')
@with_session
def child(req, session):
session['foo'] = 'bar'
session.save()
return ''
@app.get('/child/foo')
@with_session
def foo(req, session):
return str(session.get('foo'))
@app.get('/child/delete')
@with_session
def delete(req, session):
session.delete()
return ''
res = self._run(client.get('/child'))
self.assertEqual(res.status_code, 200)
res = self._run(client.get('/'))
self.assertEqual(res.text, 'None')
res = self._run(client.get('/child/foo'))
self.assertEqual(res.text, 'bar')
res = self._run(client.get('/child/delete'))
res = self._run(client.get('/'))
self.assertEqual(res.text, 'None')
res = self._run(client.get('/child/foo'))
self.assertEqual(res.text, 'None')

View File

@@ -23,9 +23,12 @@ class TestWebSocket(unittest.TestCase):
async def handle_sse(request, sse):
await sse.send('foo')
await sse.send('bar', event='test')
await sse.send('bar', event='test', event_id='id42')
await sse.send('bar', event_id='id42')
await sse.send({'foo': 'bar'})
await sse.send([42, 'foo', 'bar'])
await sse.send(ValueError('foo'))
await sse.send(b'foo')
client = TestClient(app)
response = self._run(client.get('/sse'))
@@ -33,6 +36,46 @@ class TestWebSocket(unittest.TestCase):
self.assertEqual(response.headers['Content-Type'], 'text/event-stream')
self.assertEqual(response.text, ('data: foo\n\n'
'event: test\ndata: bar\n\n'
'event: test\nid: id42\ndata: bar\n\n'
'id: id42\ndata: bar\n\n'
'data: {"foo": "bar"}\n\n'
'data: [42, "foo", "bar"]\n\n'
'data: foo\n\n'
'data: foo\n\n'))
self.assertEqual(len(response.events), 8)
self.assertEqual(response.events[0], {
'data': b'foo', 'data_json': None, 'event': None,
'event_id': None})
self.assertEqual(response.events[1], {
'data': b'bar', 'data_json': None, 'event': 'test',
'event_id': None})
self.assertEqual(response.events[2], {
'data': b'bar', 'data_json': None, 'event': 'test',
'event_id': 'id42'})
self.assertEqual(response.events[3], {
'data': b'bar', 'data_json': None, 'event': None,
'event_id': 'id42'})
self.assertEqual(response.events[4], {
'data': b'{"foo": "bar"}', 'data_json': {'foo': 'bar'},
'event': None, 'event_id': None})
self.assertEqual(response.events[5], {
'data': b'[42, "foo", "bar"]', 'data_json': [42, 'foo', 'bar'],
'event': None, 'event_id': None})
self.assertEqual(response.events[6], {
'data': b'foo', 'data_json': None, 'event': None,
'event_id': None})
self.assertEqual(response.events[7], {
'data': b'foo', 'data_json': None, 'event': None,
'event_id': None})
def test_sse_exception(self):
app = Microdot()
@app.route('/sse')
@with_sse
async def handle_sse(request, sse):
await sse.send('foo')
await sse.send(1 / 0)
client = TestClient(app)
self.assertRaises(ZeroDivisionError, self._run, client.get('/sse'))

View File

@@ -7,11 +7,14 @@ class TestURLPattern(unittest.TestCase):
p = URLPattern('/')
self.assertEqual(p.match('/'), {})
self.assertIsNone(p.match('/foo'))
self.assertIsNone(p.match('foo'))
self.assertIsNone(p.match(''))
p = URLPattern('/foo/bar')
self.assertEqual(p.match('/foo/bar'), {})
self.assertIsNone(p.match('/foo'))
self.assertIsNone(p.match('/foo/bar/'))
self.assertIsNone(p.match('/foo/bar/baz'))
p = URLPattern('/foo//bar/baz/')
self.assertEqual(p.match('/foo//bar/baz/'), {})
@@ -23,32 +26,50 @@ class TestURLPattern(unittest.TestCase):
p = URLPattern('/<arg>')
self.assertEqual(p.match('/foo'), {'arg': 'foo'})
self.assertIsNone(p.match('/'))
self.assertIsNone(p.match('//'))
self.assertIsNone(p.match(''))
self.assertIsNone(p.match('foo/'))
self.assertIsNone(p.match('/foo/'))
self.assertIsNone(p.match('//foo/'))
self.assertIsNone(p.match('/foo//'))
self.assertIsNone(p.match('/foo/bar'))
self.assertIsNone(p.match('/foo//bar'))
p = URLPattern('/<arg>/')
self.assertEqual(p.match('/foo/'), {'arg': 'foo'})
self.assertIsNone(p.match('/'))
self.assertIsNone(p.match('/foo'))
self.assertIsNone(p.match('/foo/bar'))
self.assertIsNone(p.match('/foo/bar/'))
p = URLPattern('/<string:arg>')
self.assertEqual(p.match('/foo'), {'arg': 'foo'})
self.assertIsNone(p.match('/'))
self.assertIsNone(p.match('/foo/'))
self.assertIsNone(p.match('/foo/bar'))
self.assertIsNone(p.match('/foo/bar/'))
p = URLPattern('/<string:arg>/')
self.assertEqual(p.match('/foo/'), {'arg': 'foo'})
self.assertIsNone(p.match('/'))
self.assertIsNone(p.match('/foo'))
self.assertIsNone(p.match('/foo/bar'))
self.assertIsNone(p.match('/foo/bar/'))
p = URLPattern('/foo/<arg1>/bar/<arg2>')
self.assertEqual(p.match('/foo/one/bar/two'),
{'arg1': 'one', 'arg2': 'two'})
self.assertIsNone(p.match('/'))
self.assertIsNone(p.match('/foo/'))
self.assertIsNone(p.match('/foo/bar'))
self.assertIsNone(p.match('/foo//bar/'))
self.assertIsNone(p.match('/foo//bar//'))
def test_int_argument(self):
p = URLPattern('/users/<int:id>')
self.assertEqual(p.match('/users/123'), {'id': 123})
self.assertEqual(p.match('/users/-123'), {'id': -123})
self.assertEqual(p.match('/users/0'), {'id': 0})
self.assertIsNone(p.match('/users/'))
self.assertIsNone(p.match('/users/abc'))
self.assertIsNone(p.match('/users/123abc'))
@@ -82,7 +103,10 @@ class TestURLPattern(unittest.TestCase):
p = URLPattern('/users/<re:[a-c]+:id>')
self.assertEqual(p.match('/users/ab'), {'id': 'ab'})
self.assertEqual(p.match('/users/bca'), {'id': 'bca'})
self.assertIsNone(p.match('/users'))
self.assertIsNone(p.match('/users/'))
self.assertIsNone(p.match('/users/abcd'))
self.assertIsNone(p.match('/users/abcdx'))
def test_many_arguments(self):
p = URLPattern('/foo/<path:path>/<int:id>/bar/<name>')
@@ -95,5 +119,30 @@ class TestURLPattern(unittest.TestCase):
self.assertIsNone(p.match('/foo/abc/def/123/test'))
def test_invalid_url_patterns(self):
self.assertRaises(ValueError, URLPattern, '/users/<foo/bar')
self.assertRaises(ValueError, URLPattern, '/users/<badtype:id>')
p = URLPattern('/users/<foo/bar')
self.assertRaises(ValueError, p.compile)
p = URLPattern('/users/<badtype:id>')
self.assertRaises(ValueError, p.compile)
def test_custom_url_pattern(self):
URLPattern.register_type('hex', '[0-9a-f]+')
p = URLPattern('/users/<hex:id>')
self.assertEqual(p.match('/users/a1'), {'id': 'a1'})
self.assertIsNone(p.match('/users/ab12z'))
URLPattern.register_type('hex', '[0-9a-f]+',
parser=lambda value: int(value, 16))
p = URLPattern('/users/<hex:id>')
self.assertEqual(p.match('/users/a1'), {'id': 161})
self.assertIsNone(p.match('/users/ab12z'))
def hex_parser(value):
try:
return int(value, 16)
except ValueError:
return None
URLPattern.register_type('hex', parser=hex_parser)
p = URLPattern('/users/<hex:id>')
self.assertEqual(p.match('/users/a1'), {'id': 161})
self.assertIsNone(p.match('/users/ab12z'))

View File

@@ -1,5 +1,5 @@
import unittest
from microdot.microdot import urlencode, urldecode_str, urldecode_bytes
from microdot.microdot import urlencode, urldecode
class TestURLEncode(unittest.TestCase):
@@ -7,5 +7,7 @@ class TestURLEncode(unittest.TestCase):
self.assertEqual(urlencode('?foo=bar&x'), '%3Ffoo%3Dbar%26x')
def test_urldecode(self):
self.assertEqual(urldecode_str('%3Ffoo%3Dbar%26x'), '?foo=bar&x')
self.assertEqual(urldecode_bytes(b'%3Ffoo%3Dbar%26x'), '?foo=bar&x')
self.assertEqual(urldecode('%3Ffoo%3Dbar%26x'), '?foo=bar&x')
self.assertEqual(urldecode(b'%3Ffoo%3Dbar%26x'), '?foo=bar&x')
self.assertEqual(urldecode('dot%e2%80%a2dot'), 'dot•dot')
self.assertEqual(urldecode(b'dot%e2%80%a2dot'), 'dot•dot')

View File

@@ -1,8 +1,8 @@
import asyncio
import sys
import unittest
from microdot import Microdot
from microdot.websocket import with_websocket, WebSocket
from microdot import Microdot, Request
from microdot.websocket import with_websocket, WebSocket, WebSocketError
from microdot.test_client import TestClient
@@ -17,6 +17,7 @@ class TestWebSocket(unittest.TestCase):
return self.loop.run_until_complete(coro)
def test_websocket_echo(self):
WebSocket.max_message_length = 65537
app = Microdot()
@app.route('/echo')
@@ -26,34 +27,10 @@ class TestWebSocket(unittest.TestCase):
data = await ws.receive()
await ws.send(data)
results = []
def ws():
data = yield 'hello'
results.append(data)
data = yield b'bye'
results.append(data)
data = yield b'*' * 300
results.append(data)
data = yield b'+' * 65537
results.append(data)
client = TestClient(app)
res = self._run(client.websocket('/echo', ws))
self.assertIsNone(res)
self.assertEqual(results, ['hello', b'bye', b'*' * 300, b'+' * 65537])
@unittest.skipIf(sys.implementation.name == 'micropython',
'no support for async generators in MicroPython')
def test_websocket_echo_async_client(self):
app = Microdot()
@app.route('/echo')
@app.route('/divzero')
@with_websocket
async def index(req, ws):
while True:
data = await ws.receive()
await ws.send(data)
async def divzero(req, ws):
1 / 0
results = []
@@ -69,9 +46,38 @@ class TestWebSocket(unittest.TestCase):
client = TestClient(app)
res = self._run(client.websocket('/echo', ws))
self.assertIsNone(res)
self.assertIsNone(res.body)
self.assertEqual(results, ['hello', b'bye', b'*' * 300, b'+' * 65537])
res = self._run(client.websocket('/divzero', ws))
self.assertIsNone(res.body)
WebSocket.max_message_length = -1
@unittest.skipIf(sys.implementation.name == 'micropython',
'no support for async generators in MicroPython')
def test_websocket_large_message(self):
saved_max_body_length = Request.max_body_length
Request.max_body_length = 10
app = Microdot()
@app.route('/echo')
@with_websocket
async def index(req, ws):
data = await ws.receive()
await ws.send(data)
results = []
async def ws():
data = yield '0123456789abcdef'
results.append(data)
client = TestClient(app)
res = self._run(client.websocket('/echo', ws))
self.assertIsNone(res.body)
self.assertEqual(results, [])
Request.max_body_length = saved_max_body_length
def test_bad_websocket_request(self):
app = Microdot()
@@ -106,7 +112,7 @@ class TestWebSocket(unittest.TestCase):
(None, 'foo'))
self.assertEqual(ws._process_websocket_frame(WebSocket.BINARY, b'foo'),
(None, b'foo'))
self.assertRaises(OSError, ws._process_websocket_frame,
self.assertRaises(WebSocketError, ws._process_websocket_frame,
WebSocket.CLOSE, b'')
self.assertEqual(ws._process_websocket_frame(WebSocket.PING, b'foo'),
(WebSocket.PONG, b'foo'))

View File

@@ -1,23 +1,24 @@
FROM ubuntu:22.04
ARG DEBIAN_FRONTEND=noninteractive
ARG VERSION=master
ENV VERSION=$VERSION
RUN apt-get update && \
apt-get install -y build-essential libffi-dev git pkg-config python3 && \
rm -rf /var/lib/apt/lists/* && \
git clone https://github.com/micropython/micropython.git && \
cd micropython && \
git checkout $VERSION && \
git submodule update --init && \
cd mpy-cross && \
make && \
cd .. && \
cd ports/unix && \
make && \
make test && \
make install && \
apt-get purge --auto-remove -y build-essential libffi-dev git pkg-config python3 && \
cd ../../.. && \
rm -rf micropython
CMD ["/usr/local/bin/micropython"]

View File

@@ -0,0 +1,24 @@
FROM ubuntu:22.04
ARG DEBIAN_FRONTEND=noninteractive
ARG VERSION=main
ENV VERSION=$VERSION
RUN apt-get update && \
apt-get install -y build-essential libffi-dev git pkg-config python3 && \
rm -rf /var/lib/apt/lists/* && \
git clone https://github.com/adafruit/circuitpython.git && \
cd circuitpython && \
git checkout $VERSION && \
git submodule update --init lib tools frozen && \
cd mpy-cross && \
make && \
cd .. && \
cd ports/unix && \
make && \
make install && \
apt-get purge --auto-remove -y build-essential libffi-dev git pkg-config python3 && \
cd ../../.. && \
rm -rf circuitpython
CMD ["/usr/local/bin/micropython"]

tools/update-circuitpython.sh Executable file
View File

@@ -0,0 +1,11 @@
#!/bin/bash
# this script updates the circuitpython binary in the /bin directory that is
# used to run unit tests under GitHub Actions builds
DOCKER=${DOCKER:-docker}
VERSION=${1:-main}
$DOCKER build -f Dockerfile.circuitpython --build-arg VERSION=$VERSION -t circuitpython .
$DOCKER create -t --name dummy-circuitpython circuitpython
$DOCKER cp dummy-circuitpython:/usr/local/bin/micropython ../bin/circuitpython
$DOCKER rm dummy-circuitpython

View File

@@ -3,8 +3,9 @@
# used to run unit tests under GitHub Actions builds
DOCKER=${DOCKER:-docker}
VERSION=${1:-master}
$DOCKER build -t micropython .
$DOCKER build --build-arg VERSION=$VERSION -t micropython .
$DOCKER create -it --name dummy-micropython micropython
$DOCKER cp dummy-micropython:/usr/local/bin/micropython ../bin/micropython
$DOCKER rm dummy-micropython

tox.ini
View File

@@ -1,5 +1,5 @@
[tox]
envlist=flake8,py38,py39,py310,py311,py312,upy,benchmark
envlist=flake8,py38,py39,py310,py311,py312,upy,cpy,benchmark,docs
skipsdist=True
skip_missing_interpreters=True
@@ -29,10 +29,13 @@ setenv=
allowlist_externals=sh
commands=sh -c "bin/micropython run_tests.py"
[testenv:cpy]
allowlist_externals=sh
commands=sh -c "bin/circuitpython run_tests.py"
[testenv:upy-mac]
allowlist_externals=micropython
commands=micropython run_tests.py
deps=
[testenv:benchmark]
deps=
@@ -55,3 +58,13 @@ deps=
flake8
commands=
flake8 --ignore=W503 --exclude examples/templates/utemplate/templates src tests examples
[testenv:docs]
changedir=docs
deps=
sphinx
pyjwt
allowlist_externals=
make
commands=
make html