Add 'rosbags/' from commit 'c80625df279c154c6ec069cbac30faa319755e47'

git-subtree-dir: rosbags
git-subtree-mainline: 48df1fbdf4490f3cbfa3267c998d1a0fc98378ca
git-subtree-split: c80625df279c154c6ec069cbac30faa319755e47
This commit is contained in:
Apoorva Gupta 2023-03-28 18:21:08 +05:30
commit 0c9504b343
99 changed files with 16378 additions and 0 deletions

rosbags/.gitignore
*.py[co]
*.egg-info/
.eggs/
__pycache__/
/build/
/dist/
/public/
/venv/
/.mypy_cache/
/.pytest_cache/
/htmlcov/
/.coverage
/coverage.xml
/report.xml
/tools/messages/[^.]*
*.sw[op]
/.vscode

rosbags/.gitlab-ci.yml
stages:
- test
- build
test:
stage: test
image: python:3.8
script:
- python3.8 -m venv venv
- venv/bin/python -m pip install -r requirements-dev.txt
- venv/bin/python -m pip install -e .[dev]
- venv/bin/pytest --cov-report=term --cov-report=xml --junit-xml=report.xml
- venv/bin/flake8 src tests
- venv/bin/mypy --no-error-summary src tests
- venv/bin/pylint --jobs 0 --score n src tests
- venv/bin/yapf -dpr src tests
- venv/bin/sphinx-build docs public
coverage: '/\d+\%\s*$/'
artifacts:
paths:
- public
reports:
coverage_report:
coverage_format: cobertura
path: coverage.xml
junit: report.xml
build:
stage: build
image: python:3.8
script:
- python3.8 -m venv venv
- venv/bin/python -m pip install build
- venv/bin/python -m build .
artifacts:
paths:
- dist
pages:
stage: build
image: python:3.8
script:
- ls public
artifacts:
paths:
- public
only:
- master

## Your Environment
Thank you for taking the time to report an issue.
To more efficiently resolve this issue, we'd like to know some basic information about your system and setup.
1) Your operating system:
2) Version of python you are running (`python --version`):
3) How did you install rosbags? Did you use pip to install from PyPI or a repository checkout or something else?
4) Version of rosbags you have installed (`pip show rosbags | grep Version`):
If you're having issues with (de)serialization of custom message types, please include a copy of the following:
* Message definition files (msg or idl)
* The bytes of an example message
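To paste the bytes of an example message into a report, a small helper can render the raw payload as a hex dump; `format_rawdata` is a hypothetical helper sketched here for convenience, not part of rosbags (the `rawdata` itself can be obtained, e.g., from `AnyReader.messages()`):

```python
def format_rawdata(rawdata: bytes, width: int = 16) -> str:
    """Render raw message bytes as a hex dump suitable for pasting into a report."""
    lines = []
    for i in range(0, len(rawdata), width):
        # hex(' ') separates byte pairs with spaces, e.g. b'\x00\x01' -> '00 01'
        lines.append(rawdata[i:i + width].hex(' '))
    return '\n'.join(lines)
```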
## The Issue
Please describe the issue that you are experiencing.
## Steps to Reproduce
If the issue is predictable and consistently reproducible, please list the steps here.

rosbags/CHANGES.rst
.. _changes:
Changes
=======
0.9.15 - 2023-03-02
-------------------
- Refactor rosbag2 Reader for multiple storage backends
- Improve parsing of IDL files
- Handle bags containing only connection records
- Add AnyReader to documentation
- Add initial MCAP reader for rosbag2 `#33`_
.. _#33: https://gitlab.com/ternaris/rosbags/issues/33
0.9.14 - 2023-01-12
-------------------
- Fix reader example in README `#40`_
- Flush decompressed files rosbag2.Reader
- Advertise Python 3.11 compatibility
.. _#40: https://gitlab.com/ternaris/rosbags/issues/40
0.9.13 - 2022-09-23
-------------------
- Fix parsing of comments in message definitions `#31`_
- Fix parsing of members starting with ``string`` in message definitions `#35`_
- Change lz4 compression level to 0 `#36`_
- Add include filters to rosbag conversion `#38`_
- Implement direct ros1 (de)serialization
.. _#31: https://gitlab.com/ternaris/rosbags/issues/31
.. _#35: https://gitlab.com/ternaris/rosbags/issues/35
.. _#36: https://gitlab.com/ternaris/rosbags/issues/36
.. _#38: https://gitlab.com/ternaris/rosbags/issues/38
0.9.12 - 2022-07-27
-------------------
- Add support for rosbag2 version 6 metadata `#30`_
- Enable rosbags-convert to exclude topics `#25`_
.. _#30: https://gitlab.com/ternaris/rosbags/issues/30
.. _#25: https://gitlab.com/ternaris/rosbags/issues/25
0.9.11 - 2022-05-17
-------------------
- Report start_time and end_time on empty bags
0.9.10 - 2022-05-04
-------------------
- Add support for multiple type stores
- Document which types are supported out of the box `#21`_
- Unify Connection and TopicInfo objects across rosbag1 and rosbag2
- Add experimental all-in-one reader for rosbag1, split rosbag1, and rosbag2
- Convert reader and writer .connection attribute from dict to list
- Add support for rosbag2 version 5 metadata `#18`_
- Speed up opening of rosbag1 files
- Fix serialization of empty message sequences `#23`_
.. _#18: https://gitlab.com/ternaris/rosbags/issues/18
.. _#21: https://gitlab.com/ternaris/rosbags/issues/21
.. _#23: https://gitlab.com/ternaris/rosbags/issues/23
0.9.9 - 2022-01-10
------------------
- Fix documentation code samples `#15`_
- Fix handling of padding after empty sequences `#14`_
- Support conversion from rosbag2 to rosbag1 `#11`_
.. _#11: https://gitlab.com/ternaris/rosbags/issues/11
.. _#14: https://gitlab.com/ternaris/rosbags/issues/14
.. _#15: https://gitlab.com/ternaris/rosbags/issues/15
0.9.8 - 2021-11-25
------------------
- Support bool and float constants in msg files
0.9.7 - 2021-11-09
------------------
- Fix parsing of const fields with string value `#9`_
- Parse empty msg definitions
- Make packages PEP561 compliant
- Parse msg bounded fields and default values `#12`_
.. _#9: https://gitlab.com/ternaris/rosbags/issues/9
.. _#12: https://gitlab.com/ternaris/rosbags/issues/12
0.9.6 - 2021-10-04
------------------
- Do not match msg separator as constant value
0.9.5 - 2021-10-04
------------------
- Add string constant support to msg parser
0.9.4 - 2021-09-15
------------------
- Make reader1 API match reader2
- Fix connection mapping for reader2 messages `#1`_, `#8`_
.. _#1: https://gitlab.com/ternaris/rosbags/issues/1
.. _#8: https://gitlab.com/ternaris/rosbags/issues/8
0.9.3 - 2021-08-06
------------------
- Add const fields to type classes
- Add CDR to ROS1 bytestream conversion
- Add ROS1 message definition generator
- Use connection oriented APIs in readers and writers
- Add rosbag1 writer
0.9.2 - 2021-07-08
------------------
- Support relative type references in msg files
0.9.1 - 2021-07-05
------------------
- Use half-open intervals for time ranges
- Create appropriate QoS profiles for latched topics in converted bags
- Fix return value tuple order of messages() in documentation `#2`_
- Add type hints to message classes
- Remove non-default ROS2 message types
- Support multi-line comments in idl files
- Fix parsing of msg files on non-POSIX platforms `#4`_
.. _#2: https://gitlab.com/ternaris/rosbags/issues/2
.. _#4: https://gitlab.com/ternaris/rosbags/issues/4
0.9.0 - 2021-05-16
------------------
- Initial Release

rosbags/CONTRIBUTING.rst
==================
Contribution guide
==================
Thank you for considering contributing to rosbags. Below is information on how to report issues and how to submit your contributions.
Rights to and license of contributions
======================================
Rosbags is licensed under `Apache 2.0`_. Your submission of an issue, merge request, comment, or code to us is:
1. If your employer has rights in your contributions, your representation that your employer has authorized you to enter into this agreement on its behalf;
2. Your agreement, or your employer's agreement, with the terms and conditions in this document;
3. Your signature of the `Developer Certificate of Origin`_; and
4. Your grant of a license to your contributions under `Apache 2.0`_.
Contributing code / merge requests
==================================
In order to contribute code there are a few noteworthy things:
1. Especially for non-trivial contributions, please **submit an issue first** to discuss your ideas.
2. If your merge request relates to an existing issue, please reference it from your merge request.
3. When creating a merge request, please `allow collaboration`_. This enables us to make small adjustments and rebase the branch as needed. Please use dedicated branches for your merge request and don't give us access to a branch that is dear to you.
4. Stick to *The seven rules of a great Git commit message* (see below).
5. We require you to **sign-off your commits** (see below). Your sign-off indicates that you agreed to the terms and conditions laid out in this document, if applicable on behalf of your employer.
.. _allow collaboration:
https://docs.gitlab.com/ee/user/project/merge_requests/allow_collaboration.html
The seven rules of a great Git commit message
---------------------------------------------
We like `The seven rules of a great Git commit message`_, summarized here for completeness; follow the links for further reading.
1. `Separate subject from body with a blank line <https://chris.beams.io/posts/git-commit/#separate>`_
2. `Limit the subject line to 50 characters <https://chris.beams.io/posts/git-commit/#limit-50>`_ (soft-limit 50, hard-limit 72)
3. `Start subject line with uppercase letter <https://chris.beams.io/posts/git-commit/#capitalize>`_
4. `Do not end the subject line with a period <https://chris.beams.io/posts/git-commit/#end>`_
5. `Use the imperative mood in the subject line <https://chris.beams.io/posts/git-commit/#imperative>`_
6. `Wrap the body at 72 characters <https://chris.beams.io/posts/git-commit/#wrap-72>`_
7. `Use the body to explain what and why vs. how <https://chris.beams.io/posts/git-commit/#why-not-how>`_
.. _The seven rules of a great Git commit message: https://chris.beams.io/posts/git-commit/#seven-rules
Signing off a commit
--------------------
You sign off a commit by adding a line like the following to the bottom of its commit message, separated by an empty line.
::
Signed-off-by: Fullname <email@example.net>
Make sure it reflects your real name and email address. Git does this automatically when using ``git commit -s``.
Except for the licenses granted herein, you reserve all right, title, and interest in and to your contributions.
.. _Apache 2.0: ./LICENSE.txt
.. _Developer Certificate of Origin: https://developercertificate.org/

rosbags/LICENSE.txt
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

rosbags/README.rst
.. image:: https://gitlab.com/ternaris/rosbags/badges/master/pipeline.svg
:target: https://gitlab.com/ternaris/rosbags/-/commits/master
:alt: pipeline status
.. image:: https://gitlab.com/ternaris/rosbags/badges/master/coverage.svg
:target: https://gitlab.com/ternaris/rosbags/-/commits/master
:alt: coverage report
.. image:: https://img.shields.io/pypi/pyversions/rosbags
:alt: python versions
=======
Rosbags
=======
Rosbags is the **pure python** library for everything rosbag. It contains:
- **highlevel** easy-to-use interfaces,
- **rosbag2** reader and writer,
- **rosbag1** reader and writer,
- **extensible** type system with serializers and deserializers,
- **efficient converter** between rosbag1 and rosbag2,
- and more.
Rosbags does not have any dependencies on the ROS software stacks and can be used on its own or alongside ROS1 or ROS2.
Rosbags was developed for `MARV <https://gitlab.com/ternaris/marv-robotics>`_, which requires a fast, correct, and flexible library to read, manipulate, and write the various rosbag file formats.
Getting started
===============
Rosbags is published on PyPI and does not have any special dependencies. Simply install with pip::
pip install rosbags
Read and deserialize messages from rosbag1 or rosbag2 files:
.. code-block:: python
from pathlib import Path
from rosbags.highlevel import AnyReader
# create reader instance and open for reading
with AnyReader([Path('/home/ros/rosbag_2020_03_24')]) as reader:
connections = [x for x in reader.connections if x.topic == '/imu_raw/Imu']
for connection, timestamp, rawdata in reader.messages(connections=connections):
msg = reader.deserialize(rawdata, connection.msgtype)
print(msg.header.frame_id)
Convert between rosbag versions::
# Convert "foo.bag", result will be "foo/"
rosbags-convert foo.bag
# Convert "bar", result will be "bar.bag"
rosbags-convert bar
# Convert "foo.bag", save the result as "bar"
rosbags-convert foo.bag --dst /path/to/bar
# Convert "bar", save the result as "foo.bag"
rosbags-convert bar --dst /path/to/foo.bag
Documentation
=============
Read the `documentation <https://ternaris.gitlab.io/rosbags/>`_ for further information.
.. end documentation
Contributing
============
Thank you for considering contributing to rosbags.
To submit issues or create merge requests please follow the instructions provided in the `contribution guide <https://gitlab.com/ternaris/rosbags/-/blob/master/CONTRIBUTING.rst>`_.
By contributing to rosbags you accept and agree to the terms and conditions laid out in there.
Development
===========
Clone the repository and setup your local checkout::
git clone https://gitlab.com/ternaris/rosbags.git
cd rosbags
python -m venv venv
. venv/bin/activate
pip install -r requirements-dev.txt
pip install -e .
This creates a new virtual environment with the necessary python dependencies and installs rosbags in editable mode. The rosbags code base uses pytest as its test runner; run the test suite by simply invoking::
pytest
To build the documentation from its source run sphinx-build::
sphinx-build -a docs public
The entry point to the local documentation build should be available under ``public/index.html``.
Support
=======
Professional support is available from `Ternaris <https://ternaris.com>`_.

rosbags.convert
===============
.. automodule:: rosbags.convert
:members:
:show-inheritance:

rosbags.highlevel
=================
.. automodule:: rosbags.highlevel
:members:
:show-inheritance:

rosbags.rosbag1
===============
.. automodule:: rosbags.rosbag1
:members:
:show-inheritance:

rosbags.rosbag2
===============
.. automodule:: rosbags.rosbag2
:members:
:show-inheritance:

Rosbags namespace
=================
.. toctree::
:maxdepth: 4
rosbags.convert
rosbags.highlevel
rosbags.rosbag1
rosbags.rosbag2
rosbags.serde
rosbags.typesys
rosbags.typesys.types

rosbags.serde
=============
.. automodule:: rosbags.serde
:members:
:show-inheritance:

rosbags.typesys
===============
.. automodule:: rosbags.typesys
:members:
:show-inheritance:

rosbags.typesys.types
=====================
.. automodule:: rosbags.typesys.types
:members:
:show-inheritance:

rosbags/docs/changes.rst
.. include:: ../CHANGES.rst

rosbags/docs/conf.py
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Sphinx config."""
import typing
# https://github.com/sphinx-doc/sphinx/issues/9243
import sphinx.builders.html as _1
import sphinx.builders.latex as _2
import sphinx.builders.texinfo as _3
import sphinx.builders.text as _4
import sphinx.ext.autodoc as _5
__all__ = ['_1', '_2', '_3', '_4', '_5']
# pylint: disable=invalid-name,redefined-builtin
typing.TYPE_CHECKING = True
project = 'Rosbags'
copyright = '2020-2023, Ternaris'
author = 'Ternaris'
autoapi_python_use_implicit_namespaces = True
autodoc_typehints = 'description'
extensions = [
'sphinx.ext.autodoc',
'sphinx.ext.napoleon',
'sphinx_autodoc_typehints',
'sphinx_rtd_theme',
]
html_theme = 'sphinx_rtd_theme'

"""Example: Edit timestamps."""
from __future__ import annotations
from typing import TYPE_CHECKING, cast
from rosbags.interfaces import ConnectionExtRosbag2
from rosbags.rosbag2 import Reader, Writer
from rosbags.serde import deserialize_cdr, serialize_cdr
if TYPE_CHECKING:
from pathlib import Path
def offset_timestamps(src: Path, dst: Path, offset: int) -> None:
"""Offset timestamps.
Args:
src: Source path.
dst: Destination path.
offset: Amount of nanoseconds to offset timestamps.
"""
with Reader(src) as reader, Writer(dst) as writer:
conn_map = {}
for conn in reader.connections:
ext = cast(ConnectionExtRosbag2, conn.ext)
conn_map[conn.id] = writer.add_connection(
conn.topic,
conn.msgtype,
ext.serialization_format,
ext.offered_qos_profiles,
)
for conn, timestamp, data in reader.messages():
# Adjust header timestamps, too
msg = deserialize_cdr(data, conn.msgtype)
if head := getattr(msg, 'header', None):
headstamp = head.stamp.sec * 10**9 + head.stamp.nanosec + offset
head.stamp.sec = headstamp // 10**9
head.stamp.nanosec = headstamp % 10**9
data = serialize_cdr(msg, conn.msgtype)
writer.write(conn_map[conn.id], timestamp + offset, data)
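The header-stamp arithmetic used above can be checked in isolation. `offset_stamp` below is a hypothetical helper sketched for illustration, not part of the rosbags API:

```python
def offset_stamp(sec: int, nanosec: int, offset: int) -> tuple[int, int]:
    """Shift a (sec, nanosec) timestamp by offset nanoseconds.

    Mirrors the split used when rewriting header stamps: combine into total
    nanoseconds, then divide back into whole seconds and the remainder.
    """
    total = sec * 10**9 + nanosec + offset
    return total // 10**9, total % 10**9
```

Rolling over a second boundary works as expected: `offset_stamp(1, 999_999_999, 2)` returns `(2, 1)`.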

"""Example: Remove topic."""
from __future__ import annotations
from typing import TYPE_CHECKING, cast
from rosbags.interfaces import ConnectionExtRosbag2
from rosbags.rosbag2 import Reader, Writer
if TYPE_CHECKING:
from pathlib import Path
def remove_topic(src: Path, dst: Path, topic: str) -> None:
"""Remove topic from rosbag2.
Args:
src: Source path.
dst: Destination path.
topic: Name of topic to remove.
"""
with Reader(src) as reader, Writer(dst) as writer:
conn_map = {}
for conn in reader.connections:
if conn.topic == topic:
continue
ext = cast(ConnectionExtRosbag2, conn.ext)
conn_map[conn.id] = writer.add_connection(
conn.topic,
conn.msgtype,
ext.serialization_format,
ext.offered_qos_profiles,
)
rconns = [reader.connections[x] for x in conn_map]
for conn, timestamp, data in reader.messages(connections=rconns):
writer.write(conn_map[conn.id], timestamp, data)

Edit rosbags
============
Rosbags does not support opening files in read-write mode, but implicitly enforces copy-on-write semantics. Apart from the mapping of reader to writer connections, the process is fairly straightforward.
Remove topic
------------
.. literalinclude:: ./edit_rosbags_remove_topic.py
Edit timestamps
---------------
.. literalinclude:: ./edit_rosbags_edit_timestamps.py

Register custom message types
=============================
Out of the box, rosbags only supports the message types that ship with a default ROS2 distribution. If you want to (de)serialize custom messages, you need to add them to the type system manually.
From rosbag1
------------
.. literalinclude:: ./register_types_rosbag1.py
From definition string
----------------------
.. literalinclude:: ./register_types_string.py
From multiple files
-------------------
.. literalinclude:: ./register_types_files.py

"""Example: Register types from msg files."""
from pathlib import Path
from rosbags.typesys import get_types_from_msg, register_types
def guess_msgtype(path: Path) -> str:
"""Guess message type name from path."""
name = path.relative_to(path.parents[2]).with_suffix('')
if 'msg' not in name.parts:
name = name.parent / 'msg' / name.name
return str(name)
add_types = {}
for pathstr in [
'/path/to/custom_msgs/msg/Speed.msg',
'/path/to/custom_msgs/msg/Accel.msg',
]:
msgpath = Path(pathstr)
msgdef = msgpath.read_text(encoding='utf-8')
add_types.update(get_types_from_msg(msgdef, guess_msgtype(msgpath)))
register_types(add_types)
# Type import works only after the register_types call,
# the classname is derived from the msgtype names above.
# pylint: disable=no-name-in-module,wrong-import-position
from rosbags.typesys.types import custom_msgs__msg__Accel as Accel # type: ignore # noqa
from rosbags.typesys.types import custom_msgs__msg__Speed as Speed # type: ignore # noqa
# pylint: enable=no-name-in-module,wrong-import-position
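The path-to-typename mapping performed by `guess_msgtype` above can be exercised standalone, assuming POSIX-style paths laid out as `<pkg>/msg/<Name>.msg`:

```python
from pathlib import Path


def guess_msgtype(path: Path) -> str:
    """Guess message type name from path (same logic as the example above)."""
    # Keep the last three components (pkg/msg/Name) and drop the suffix.
    name = path.relative_to(path.parents[2]).with_suffix('')
    if 'msg' not in name.parts:
        name = name.parent / 'msg' / name.name
    return str(name)
```

For the paths used in the example, `guess_msgtype(Path('/path/to/custom_msgs/msg/Speed.msg'))` yields `'custom_msgs/msg/Speed'`, which is the type name expected by `get_types_from_msg`.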

"""Example: Register rosbag1 types."""
from __future__ import annotations
from typing import TYPE_CHECKING
from rosbags.rosbag1 import Reader
from rosbags.typesys import get_types_from_msg, register_types
if TYPE_CHECKING:
from pathlib import Path
def process_bag(src: Path) -> None:
"""Register contained messages types before processing bag.
Args:
src: Bag to process.
"""
with Reader(src) as reader:
typs = {}
for conn in reader.connections:
typs.update(get_types_from_msg(conn.msgdef, conn.msgtype))
register_types(typs)
# Now all message types used in the bag are registered
# for conn, timestamp, data in reader.messages():
# ...

"""Example: Register type from definition string."""
from rosbags.serde import serialize_cdr
from rosbags.typesys import get_types_from_msg, register_types
# Your custom message definition
STRIDX_MSG = """
string string
uint32 index
"""
register_types(get_types_from_msg(STRIDX_MSG, 'custom_msgs/msg/StrIdx'))
# Type import works only after the register_types call,
# the classname is derived from the msgtype name above
# pylint: disable=no-name-in-module,wrong-import-position
from rosbags.typesys.types import custom_msgs__msg__StrIdx as StrIdx # type: ignore # noqa
# pylint: enable=no-name-in-module,wrong-import-position
message = StrIdx(string='foo', index=42)
# Rawdata that can be passed to rosbag2.Writer.write
rawdata = serialize_cdr(message, message.__msgtype__)
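To build intuition for the bytes `serialize_cdr` produces, here is a hand-rolled sketch of the little-endian CDR layout of this two-field message. It is illustrative only, assuming the common CDR_LE encapsulation header and 4-byte alignment for ``uint32``; use `serialize_cdr` in real code:

```python
import struct


def serialize_stridx_cdr(text: str, index: int) -> bytes:
    """Manually encode the StrIdx example message as little-endian CDR."""
    out = bytearray(b'\x00\x01\x00\x00')   # CDR encapsulation header (little-endian)
    data = text.encode() + b'\x00'          # string payload incl. NUL terminator
    out += struct.pack('<I', len(data))     # uint32 length prefix of the string
    out += data
    out += b'\x00' * (-len(out) % 4)        # pad to 4-byte alignment for uint32
    out += struct.pack('<I', index)         # the index field
    return bytes(out)
```

For `('foo', 42)` this yields 16 bytes: the 4-byte header, the length-prefixed NUL-terminated string, and the little-endian ``uint32``.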

Save images as rosbag
=====================
The following examples show how to create new ROS bags from images.
Save rosbag1
------------
.. literalinclude:: ./save_images_rosbag1.py
Save rosbag2
------------
.. literalinclude:: ./save_images_rosbag2.py

"""Example: Save images as rosbag1."""
import numpy
from rosbags.rosbag1 import Writer
from rosbags.serde import serialize_ros1
from rosbags.typesys.types import builtin_interfaces__msg__Time as Time
from rosbags.typesys.types import sensor_msgs__msg__CompressedImage as CompressedImage
from rosbags.typesys.types import std_msgs__msg__Header as Header
TOPIC = '/camera'
FRAMEID = 'map'
# Contains filenames and their timestamps
IMAGES = [
('homer.jpg', 42),
('marge.jpg', 43),
]
def save_images() -> None:
"""Iterate over IMAGES and save to output bag."""
with Writer('output.bag') as writer:
conn = writer.add_connection(TOPIC, CompressedImage.__msgtype__)
for path, timestamp in IMAGES:
message = CompressedImage(
Header(
stamp=Time(
sec=int(timestamp // 10**9),
nanosec=int(timestamp % 10**9),
),
frame_id=FRAMEID,
),
format='jpeg', # could also be 'png'
data=numpy.fromfile(path, dtype=numpy.uint8),
)
writer.write(
conn,
timestamp,
serialize_ros1(message, CompressedImage.__msgtype__),
)


@ -0,0 +1,43 @@
"""Save multiple images in rosbag2."""
import numpy
from rosbags.rosbag2 import Writer
from rosbags.serde import serialize_cdr
from rosbags.typesys.types import builtin_interfaces__msg__Time as Time
from rosbags.typesys.types import sensor_msgs__msg__CompressedImage as CompressedImage
from rosbags.typesys.types import std_msgs__msg__Header as Header
TOPIC = '/camera'
FRAMEID = 'map'
# Contains filenames and their timestamps
IMAGES = [
('homer.jpg', 42),
('marge.jpg', 43),
]
def save_images() -> None:
"""Iterate over IMAGES and save to output bag."""
with Writer('output') as writer:
conn = writer.add_connection(TOPIC, CompressedImage.__msgtype__, 'cdr', '')
for path, timestamp in IMAGES:
message = CompressedImage(
Header(
stamp=Time(
sec=int(timestamp // 10**9),
nanosec=int(timestamp % 10**9),
),
frame_id=FRAMEID,
),
format='jpeg', # could also be 'png'
data=numpy.fromfile(path, dtype=numpy.uint8),
)
writer.write(
conn,
timestamp,
serialize_cdr(message, message.__msgtype__),
)


@ -0,0 +1,66 @@
"""Example: Message instance conversion."""
from __future__ import annotations
import importlib
from typing import TYPE_CHECKING
import numpy
if TYPE_CHECKING:
from typing import Any
NATIVE_CLASSES: dict[str, Any] = {}
def to_native(msg: Any) -> Any: # noqa: ANN401
"""Convert rosbags message to native message.
Args:
msg: Rosbags message.
Returns:
Native message.
"""
msgtype: str = msg.__msgtype__
if msgtype not in NATIVE_CLASSES:
pkg, name = msgtype.rsplit('/', 1)
NATIVE_CLASSES[msgtype] = getattr(importlib.import_module(pkg.replace('/', '.')), name)
fields = {}
for name, field in msg.__dataclass_fields__.items():
if 'ClassVar' in field.type:
continue
value = getattr(msg, name)
if '__msg__' in field.type:
value = to_native(value)
elif isinstance(value, numpy.ndarray):
value = value.tolist()
fields[name] = value
return NATIVE_CLASSES[msgtype](**fields)
if __name__ == '__main__':
from rosbags.typesys.types import (
builtin_interfaces__msg__Time,
sensor_msgs__msg__Image,
std_msgs__msg__Header,
)
image = sensor_msgs__msg__Image(
std_msgs__msg__Header(
builtin_interfaces__msg__Time(42, 666),
'/frame',
),
4,
4,
'rgb8',
False,
4 * 3,
numpy.zeros(4 * 4 * 3, dtype=numpy.uint8),
)
native_image = to_native(image)
# native_image can now be passed to the ROS stack


@ -0,0 +1,10 @@
Use with native stack
=====================
Messages read with rosbags are simple dataclasses that mimic the native ROS2 interface. If you want to pass those messages to the native ROS2 stack, you need to convert them into native objects first.
Message instance conversion
---------------------------
.. literalinclude:: ./use_with_native.py

rosbags/docs/index.rst Normal file

@ -0,0 +1,48 @@
.. include:: ../README.rst
:end-before: Documentation
.. include:: ../README.rst
:start-after: .. end documentation
.. toctree::
:caption: Documentation
:maxdepth: 1
:hidden:
topics/highlevel
topics/typesys
topics/serde
topics/rosbag2
topics/rosbag1
topics/convert
.. toctree::
:caption: Usage examples
:maxdepth: 0
:hidden:
:glob:
examples/*
.. toctree::
:caption: API
:glob:
:hidden:
api/rosbags
.. toctree::
:caption: Changes
:hidden:
changes
.. toctree::
:caption: Links
:hidden:
Source Code <https://gitlab.com/ternaris/rosbags>
Issues <https://gitlab.com/ternaris/rosbags/issues>


@ -0,0 +1,36 @@
Convert rosbag versions
=======================
The :py:mod:`rosbags.convert` package includes a CLI tool to convert legacy rosbag1 files to rosbag2 and vice versa.
Features
--------
- Reasonably fast, as it converts raw ROS1 messages to raw CDR messages without going through deserialization and serialization
- Tries to match ROS1 message type names to registered ROS2 types
- Automatically registers unknown message types present in the legacy rosbag file for the conversion
- Handles differences of ``std_msgs/msg/Header`` between both ROS versions
Limitations
-----------
- Refuses to convert unindexed rosbag1 files; please reindex files before conversion
- Currently does not handle split bags
- Only ROS2 default message types are supported when converting rosbag2 to rosbag1
Usage
-----
.. code-block:: console
# Convert "foo.bag", result will be "foo/"
$ rosbags-convert foo.bag
# Convert "bar", result will be "bar.bag"
$ rosbags-convert bar
# Convert "foo.bag", save the result as "bar"
$ rosbags-convert foo.bag --dst /path/to/bar
# Convert "bar", save the result as "foo.bag"
$ rosbags-convert bar --dst /path/to/foo.bag


@ -0,0 +1,22 @@
Highlevel APIs
==============
The :py:mod:`rosbags.highlevel` package provides classes that abstract the complexity of ROS types, serialization and message access into single easy-to-use interfaces.
All in one reader
-----------------
Instances of the :py:class:`AnyReader <rosbags.highlevel.AnyReader>` class give unified access to ROS1 and ROS2 bag files. If a bag file includes message definitions, the reader auto-registers all messages in a blank type store; otherwise it falls back to the default type store. It also exposes appropriate deserialization methods on the reader instance itself.
.. code-block:: python
from pathlib import Path
from rosbags.highlevel import AnyReader
# create reader instance and open for reading
with AnyReader([Path('/home/ros/rosbag_2020_03_24')]) as reader:
connections = [x for x in reader.connections if x.topic == '/imu_raw/Imu']
for connection, timestamp, rawdata in reader.messages(connections=connections):
msg = reader.deserialize(rawdata, connection.msgtype)
print(msg.header.frame_id)
AnyReader takes a list of ``pathlib.Path`` instances as its argument. The list can contain either one ROS2 bag or one or more ROS1 bags belonging to a split recording. The reader replays ROS1 split bags in correct timestamp order.


@ -0,0 +1,54 @@
Rosbag1
=======
The :py:mod:`rosbags.rosbag1` package provides fast read and write access to raw messages stored in the legacy bag format. The rosbag1 support is built for a ROS2 world; some APIs and values are normalized to mimic ROS2 behavior and make messages originating from rosbag1 and rosbag2 behave identically. Most notably, message types are internally renamed to match their ROS2 counterparts.
Writing rosbag1
---------------
Instances of the :py:class:`Writer <rosbags.rosbag1.Writer>` class can create and write to new rosbag1 files. It is usually used as a context manager. Before the first message of a topic can be written, a connection for the topic must be added to the bag. The following example shows the typical usage pattern:
.. code-block:: python
from rosbags.rosbag1 import Writer
from rosbags.serde import cdr_to_ros1, serialize_cdr
from rosbags.typesys.types import std_msgs__msg__String as String
# create writer instance and open for writing
with Writer('/home/ros/rosbag_2020_03_24.bag') as writer:
# add new connection
topic = '/chatter'
msgtype = String.__msgtype__
connection = writer.add_connection(topic, msgtype, latching=True)
# serialize and write message
message = String('hello world')
timestamp = 42
writer.write(connection, timestamp, cdr_to_ros1(serialize_cdr(message, msgtype), msgtype))
Reading rosbag1
---------------
Instances of the :py:class:`Reader <rosbags.rosbag1.Reader>` class are typically used as context managers and provide access to bag metadata and contents after the bag has been opened. The following example shows the typical usage pattern:
.. code-block:: python
from rosbags.rosbag1 import Reader
from rosbags.serde import deserialize_cdr, ros1_to_cdr
# create reader instance
with Reader('/home/ros/rosbag_2020_03_24.bag') as reader:
# topic and msgtype information is available on .connections list
for connection in reader.connections:
print(connection.topic, connection.msgtype)
# iterate over messages
for connection, timestamp, rawdata in reader.messages():
if connection.topic == '/imu_raw/Imu':
msg = deserialize_cdr(ros1_to_cdr(rawdata, connection.msgtype), connection.msgtype)
print(msg.header.frame_id)
# messages() accepts connection filters
connections = [x for x in reader.connections if x.topic == '/imu_raw/Imu']
for connection, timestamp, rawdata in reader.messages(connections=connections):
msg = deserialize_cdr(ros1_to_cdr(rawdata, connection.msgtype), connection.msgtype)
print(msg.header.frame_id)


@ -0,0 +1,70 @@
Rosbag2
=======
The :py:mod:`rosbags.rosbag2` package provides a conformant implementation of rosbag2. It provides read-write access to raw message data saved inside rosbag2 containers, and supports all features present in the C++ reference implementation.
Supported Versions
------------------
All versions up to and including the current version 6 (ROS2 Humble) are supported.
Supported Features
------------------
Rosbag2 is a flexible format that supports plugging different serialization methods, compression formats, and storage containers together. The rosbag2 C++ reference implementation is built around plugins that provide serialization, compression, and storage. This project implements all rosbag2 core plugins that are distributed with the C++ reference implementation.
:Serializers:
- cdr (without wstring)
:Compressors:
- zstd
:Storages:
- sqlite3
- mcap
Writing rosbag2
---------------
Instances of the :py:class:`Writer <rosbags.rosbag2.Writer>` class can create and write to new rosbag2 files. It is usually used as a context manager. Before the first message of a topic can be written, a connection for the topic must be added to the bag. The following example shows the typical usage pattern:
.. code-block:: python
from rosbags.rosbag2 import Writer
from rosbags.serde import serialize_cdr
from rosbags.typesys.types import std_msgs__msg__String as String
# create writer instance and open for writing
with Writer('/home/ros/rosbag_2020_03_24') as writer:
# add new connection
topic = '/chatter'
msgtype = String.__msgtype__
connection = writer.add_connection(topic, msgtype, 'cdr', '')
# serialize and write message
timestamp = 42
message = String('hello world')
writer.write(connection, timestamp, serialize_cdr(message, msgtype))
Reading rosbag2
---------------
Instances of the :py:class:`Reader <rosbags.rosbag2.Reader>` class are used to read rosbag2 metadata and its contents. Most of the metadata is available on Reader instances right away; messages can only be accessed after the bag has been opened. To this end, it is recommended to use the Reader as a context manager. The following example shows the typical usage pattern:
.. code-block:: python
from rosbags.rosbag2 import Reader
from rosbags.serde import deserialize_cdr
# create reader instance and open for reading
with Reader('/home/ros/rosbag_2020_03_24') as reader:
# topic and msgtype information is available on .connections list
for connection in reader.connections:
print(connection.topic, connection.msgtype)
# iterate over messages
for connection, timestamp, rawdata in reader.messages():
if connection.topic == '/imu_raw/Imu':
msg = deserialize_cdr(rawdata, connection.msgtype)
print(msg.header.frame_id)
# messages() accepts connection filters
connections = [x for x in reader.connections if x.topic == '/imu_raw/Imu']
for connection, timestamp, rawdata in reader.messages(connections=connections):
msg = deserialize_cdr(rawdata, connection.msgtype)
print(msg.header.frame_id)


@ -0,0 +1,49 @@
Serialization and deserialization
=================================
The serialization and deserialization system :py:mod:`rosbags.serde` supports multiple raw message formats. For each format it provides a pair of functions, one for serialization and one for deserialization. In addition to the data to process, each function usually requires only the message type name.
Deserialization
---------------
Deserialize a CDR bytes object using :py:func:`deserialize_cdr() <rosbags.serde.deserialize_cdr>`:
.. code-block:: python
from rosbags.serde import deserialize_cdr
# rawdata is of type bytes and contains serialized message
msg = deserialize_cdr(rawdata, 'geometry_msgs/msg/Quaternion')
Deserialize a ROS1 bytes object using :py:func:`deserialize_ros1() <rosbags.serde.deserialize_ros1>`:
.. code-block:: python
from rosbags.serde import deserialize_ros1
# rawdata is of type bytes and contains serialized message
msg = deserialize_ros1(rawdata, 'geometry_msgs/msg/Quaternion')
Serialization
---------------
Serialize a message with CDR using :py:func:`serialize_cdr() <rosbags.serde.serialize_cdr>`:
.. code-block:: python
from rosbags.serde import serialize_cdr
# serialize message with system endianness
serialized = serialize_cdr(msg, 'geometry_msgs/msg/Quaternion')
# serialize message with explicit endianness
serialized = serialize_cdr(msg, 'geometry_msgs/msg/Quaternion', little_endian=False)
Serialize a message with ROS1 using :py:func:`serialize_ros1() <rosbags.serde.serialize_ros1>`:
.. code-block:: python
from rosbags.serde import serialize_ros1
serialized = serialize_ros1(msg, 'geometry_msgs/msg/Quaternion')


@ -0,0 +1,194 @@
builtin_interfaces
******************
- :py:class:`Duration <rosbags.typesys.types.builtin_interfaces__msg__Duration>`
- :py:class:`Time <rosbags.typesys.types.builtin_interfaces__msg__Time>`
diagnostic_msgs
***************
- :py:class:`DiagnosticArray <rosbags.typesys.types.diagnostic_msgs__msg__DiagnosticArray>`
- :py:class:`DiagnosticStatus <rosbags.typesys.types.diagnostic_msgs__msg__DiagnosticStatus>`
- :py:class:`KeyValue <rosbags.typesys.types.diagnostic_msgs__msg__KeyValue>`
geometry_msgs
*************
- :py:class:`Accel <rosbags.typesys.types.geometry_msgs__msg__Accel>`
- :py:class:`AccelStamped <rosbags.typesys.types.geometry_msgs__msg__AccelStamped>`
- :py:class:`AccelWithCovariance <rosbags.typesys.types.geometry_msgs__msg__AccelWithCovariance>`
- :py:class:`AccelWithCovarianceStamped <rosbags.typesys.types.geometry_msgs__msg__AccelWithCovarianceStamped>`
- :py:class:`Inertia <rosbags.typesys.types.geometry_msgs__msg__Inertia>`
- :py:class:`InertiaStamped <rosbags.typesys.types.geometry_msgs__msg__InertiaStamped>`
- :py:class:`Point <rosbags.typesys.types.geometry_msgs__msg__Point>`
- :py:class:`Point32 <rosbags.typesys.types.geometry_msgs__msg__Point32>`
- :py:class:`PointStamped <rosbags.typesys.types.geometry_msgs__msg__PointStamped>`
- :py:class:`Polygon <rosbags.typesys.types.geometry_msgs__msg__Polygon>`
- :py:class:`PolygonStamped <rosbags.typesys.types.geometry_msgs__msg__PolygonStamped>`
- :py:class:`Pose <rosbags.typesys.types.geometry_msgs__msg__Pose>`
- :py:class:`Pose2D <rosbags.typesys.types.geometry_msgs__msg__Pose2D>`
- :py:class:`PoseArray <rosbags.typesys.types.geometry_msgs__msg__PoseArray>`
- :py:class:`PoseStamped <rosbags.typesys.types.geometry_msgs__msg__PoseStamped>`
- :py:class:`PoseWithCovariance <rosbags.typesys.types.geometry_msgs__msg__PoseWithCovariance>`
- :py:class:`PoseWithCovarianceStamped <rosbags.typesys.types.geometry_msgs__msg__PoseWithCovarianceStamped>`
- :py:class:`Quaternion <rosbags.typesys.types.geometry_msgs__msg__Quaternion>`
- :py:class:`QuaternionStamped <rosbags.typesys.types.geometry_msgs__msg__QuaternionStamped>`
- :py:class:`Transform <rosbags.typesys.types.geometry_msgs__msg__Transform>`
- :py:class:`TransformStamped <rosbags.typesys.types.geometry_msgs__msg__TransformStamped>`
- :py:class:`Twist <rosbags.typesys.types.geometry_msgs__msg__Twist>`
- :py:class:`TwistStamped <rosbags.typesys.types.geometry_msgs__msg__TwistStamped>`
- :py:class:`TwistWithCovariance <rosbags.typesys.types.geometry_msgs__msg__TwistWithCovariance>`
- :py:class:`TwistWithCovarianceStamped <rosbags.typesys.types.geometry_msgs__msg__TwistWithCovarianceStamped>`
- :py:class:`Vector3 <rosbags.typesys.types.geometry_msgs__msg__Vector3>`
- :py:class:`Vector3Stamped <rosbags.typesys.types.geometry_msgs__msg__Vector3Stamped>`
- :py:class:`Wrench <rosbags.typesys.types.geometry_msgs__msg__Wrench>`
- :py:class:`WrenchStamped <rosbags.typesys.types.geometry_msgs__msg__WrenchStamped>`
libstatistics_collector
***********************
- :py:class:`DummyMessage <rosbags.typesys.types.libstatistics_collector__msg__DummyMessage>`
lifecycle_msgs
**************
- :py:class:`State <rosbags.typesys.types.lifecycle_msgs__msg__State>`
- :py:class:`Transition <rosbags.typesys.types.lifecycle_msgs__msg__Transition>`
- :py:class:`TransitionDescription <rosbags.typesys.types.lifecycle_msgs__msg__TransitionDescription>`
- :py:class:`TransitionEvent <rosbags.typesys.types.lifecycle_msgs__msg__TransitionEvent>`
nav_msgs
********
- :py:class:`GridCells <rosbags.typesys.types.nav_msgs__msg__GridCells>`
- :py:class:`MapMetaData <rosbags.typesys.types.nav_msgs__msg__MapMetaData>`
- :py:class:`OccupancyGrid <rosbags.typesys.types.nav_msgs__msg__OccupancyGrid>`
- :py:class:`Odometry <rosbags.typesys.types.nav_msgs__msg__Odometry>`
- :py:class:`Path <rosbags.typesys.types.nav_msgs__msg__Path>`
rcl_interfaces
**************
- :py:class:`FloatingPointRange <rosbags.typesys.types.rcl_interfaces__msg__FloatingPointRange>`
- :py:class:`IntegerRange <rosbags.typesys.types.rcl_interfaces__msg__IntegerRange>`
- :py:class:`ListParametersResult <rosbags.typesys.types.rcl_interfaces__msg__ListParametersResult>`
- :py:class:`Log <rosbags.typesys.types.rcl_interfaces__msg__Log>`
- :py:class:`Parameter <rosbags.typesys.types.rcl_interfaces__msg__Parameter>`
- :py:class:`ParameterDescriptor <rosbags.typesys.types.rcl_interfaces__msg__ParameterDescriptor>`
- :py:class:`ParameterEvent <rosbags.typesys.types.rcl_interfaces__msg__ParameterEvent>`
- :py:class:`ParameterEventDescriptors <rosbags.typesys.types.rcl_interfaces__msg__ParameterEventDescriptors>`
- :py:class:`ParameterType <rosbags.typesys.types.rcl_interfaces__msg__ParameterType>`
- :py:class:`ParameterValue <rosbags.typesys.types.rcl_interfaces__msg__ParameterValue>`
- :py:class:`SetParametersResult <rosbags.typesys.types.rcl_interfaces__msg__SetParametersResult>`
rmw_dds_common
**************
- :py:class:`Gid <rosbags.typesys.types.rmw_dds_common__msg__Gid>`
- :py:class:`NodeEntitiesInfo <rosbags.typesys.types.rmw_dds_common__msg__NodeEntitiesInfo>`
- :py:class:`ParticipantEntitiesInfo <rosbags.typesys.types.rmw_dds_common__msg__ParticipantEntitiesInfo>`
rosgraph_msgs
*************
- :py:class:`Clock <rosbags.typesys.types.rosgraph_msgs__msg__Clock>`
sensor_msgs
***********
- :py:class:`BatteryState <rosbags.typesys.types.sensor_msgs__msg__BatteryState>`
- :py:class:`CameraInfo <rosbags.typesys.types.sensor_msgs__msg__CameraInfo>`
- :py:class:`ChannelFloat32 <rosbags.typesys.types.sensor_msgs__msg__ChannelFloat32>`
- :py:class:`CompressedImage <rosbags.typesys.types.sensor_msgs__msg__CompressedImage>`
- :py:class:`FluidPressure <rosbags.typesys.types.sensor_msgs__msg__FluidPressure>`
- :py:class:`Illuminance <rosbags.typesys.types.sensor_msgs__msg__Illuminance>`
- :py:class:`Image <rosbags.typesys.types.sensor_msgs__msg__Image>`
- :py:class:`Imu <rosbags.typesys.types.sensor_msgs__msg__Imu>`
- :py:class:`JointState <rosbags.typesys.types.sensor_msgs__msg__JointState>`
- :py:class:`Joy <rosbags.typesys.types.sensor_msgs__msg__Joy>`
- :py:class:`JoyFeedback <rosbags.typesys.types.sensor_msgs__msg__JoyFeedback>`
- :py:class:`JoyFeedbackArray <rosbags.typesys.types.sensor_msgs__msg__JoyFeedbackArray>`
- :py:class:`LaserEcho <rosbags.typesys.types.sensor_msgs__msg__LaserEcho>`
- :py:class:`LaserScan <rosbags.typesys.types.sensor_msgs__msg__LaserScan>`
- :py:class:`MagneticField <rosbags.typesys.types.sensor_msgs__msg__MagneticField>`
- :py:class:`MultiDOFJointState <rosbags.typesys.types.sensor_msgs__msg__MultiDOFJointState>`
- :py:class:`MultiEchoLaserScan <rosbags.typesys.types.sensor_msgs__msg__MultiEchoLaserScan>`
- :py:class:`NavSatFix <rosbags.typesys.types.sensor_msgs__msg__NavSatFix>`
- :py:class:`NavSatStatus <rosbags.typesys.types.sensor_msgs__msg__NavSatStatus>`
- :py:class:`PointCloud <rosbags.typesys.types.sensor_msgs__msg__PointCloud>`
- :py:class:`PointCloud2 <rosbags.typesys.types.sensor_msgs__msg__PointCloud2>`
- :py:class:`PointField <rosbags.typesys.types.sensor_msgs__msg__PointField>`
- :py:class:`Range <rosbags.typesys.types.sensor_msgs__msg__Range>`
- :py:class:`RegionOfInterest <rosbags.typesys.types.sensor_msgs__msg__RegionOfInterest>`
- :py:class:`RelativeHumidity <rosbags.typesys.types.sensor_msgs__msg__RelativeHumidity>`
- :py:class:`Temperature <rosbags.typesys.types.sensor_msgs__msg__Temperature>`
- :py:class:`TimeReference <rosbags.typesys.types.sensor_msgs__msg__TimeReference>`
shape_msgs
**********
- :py:class:`Mesh <rosbags.typesys.types.shape_msgs__msg__Mesh>`
- :py:class:`MeshTriangle <rosbags.typesys.types.shape_msgs__msg__MeshTriangle>`
- :py:class:`Plane <rosbags.typesys.types.shape_msgs__msg__Plane>`
- :py:class:`SolidPrimitive <rosbags.typesys.types.shape_msgs__msg__SolidPrimitive>`
statistics_msgs
***************
- :py:class:`MetricsMessage <rosbags.typesys.types.statistics_msgs__msg__MetricsMessage>`
- :py:class:`StatisticDataPoint <rosbags.typesys.types.statistics_msgs__msg__StatisticDataPoint>`
- :py:class:`StatisticDataType <rosbags.typesys.types.statistics_msgs__msg__StatisticDataType>`
std_msgs
********
- :py:class:`Bool <rosbags.typesys.types.std_msgs__msg__Bool>`
- :py:class:`Byte <rosbags.typesys.types.std_msgs__msg__Byte>`
- :py:class:`ByteMultiArray <rosbags.typesys.types.std_msgs__msg__ByteMultiArray>`
- :py:class:`Char <rosbags.typesys.types.std_msgs__msg__Char>`
- :py:class:`ColorRGBA <rosbags.typesys.types.std_msgs__msg__ColorRGBA>`
- :py:class:`Empty <rosbags.typesys.types.std_msgs__msg__Empty>`
- :py:class:`Float32 <rosbags.typesys.types.std_msgs__msg__Float32>`
- :py:class:`Float32MultiArray <rosbags.typesys.types.std_msgs__msg__Float32MultiArray>`
- :py:class:`Float64 <rosbags.typesys.types.std_msgs__msg__Float64>`
- :py:class:`Float64MultiArray <rosbags.typesys.types.std_msgs__msg__Float64MultiArray>`
- :py:class:`Header <rosbags.typesys.types.std_msgs__msg__Header>`
- :py:class:`Int16 <rosbags.typesys.types.std_msgs__msg__Int16>`
- :py:class:`Int16MultiArray <rosbags.typesys.types.std_msgs__msg__Int16MultiArray>`
- :py:class:`Int32 <rosbags.typesys.types.std_msgs__msg__Int32>`
- :py:class:`Int32MultiArray <rosbags.typesys.types.std_msgs__msg__Int32MultiArray>`
- :py:class:`Int64 <rosbags.typesys.types.std_msgs__msg__Int64>`
- :py:class:`Int64MultiArray <rosbags.typesys.types.std_msgs__msg__Int64MultiArray>`
- :py:class:`Int8 <rosbags.typesys.types.std_msgs__msg__Int8>`
- :py:class:`Int8MultiArray <rosbags.typesys.types.std_msgs__msg__Int8MultiArray>`
- :py:class:`MultiArrayDimension <rosbags.typesys.types.std_msgs__msg__MultiArrayDimension>`
- :py:class:`MultiArrayLayout <rosbags.typesys.types.std_msgs__msg__MultiArrayLayout>`
- :py:class:`String <rosbags.typesys.types.std_msgs__msg__String>`
- :py:class:`UInt16 <rosbags.typesys.types.std_msgs__msg__UInt16>`
- :py:class:`UInt16MultiArray <rosbags.typesys.types.std_msgs__msg__UInt16MultiArray>`
- :py:class:`UInt32 <rosbags.typesys.types.std_msgs__msg__UInt32>`
- :py:class:`UInt32MultiArray <rosbags.typesys.types.std_msgs__msg__UInt32MultiArray>`
- :py:class:`UInt64 <rosbags.typesys.types.std_msgs__msg__UInt64>`
- :py:class:`UInt64MultiArray <rosbags.typesys.types.std_msgs__msg__UInt64MultiArray>`
- :py:class:`UInt8 <rosbags.typesys.types.std_msgs__msg__UInt8>`
- :py:class:`UInt8MultiArray <rosbags.typesys.types.std_msgs__msg__UInt8MultiArray>`
stereo_msgs
***********
- :py:class:`DisparityImage <rosbags.typesys.types.stereo_msgs__msg__DisparityImage>`
tf2_msgs
********
- :py:class:`TF2Error <rosbags.typesys.types.tf2_msgs__msg__TF2Error>`
- :py:class:`TFMessage <rosbags.typesys.types.tf2_msgs__msg__TFMessage>`
trajectory_msgs
***************
- :py:class:`JointTrajectory <rosbags.typesys.types.trajectory_msgs__msg__JointTrajectory>`
- :py:class:`JointTrajectoryPoint <rosbags.typesys.types.trajectory_msgs__msg__JointTrajectoryPoint>`
- :py:class:`MultiDOFJointTrajectory <rosbags.typesys.types.trajectory_msgs__msg__MultiDOFJointTrajectory>`
- :py:class:`MultiDOFJointTrajectoryPoint <rosbags.typesys.types.trajectory_msgs__msg__MultiDOFJointTrajectoryPoint>`
unique_identifier_msgs
**********************
- :py:class:`UUID <rosbags.typesys.types.unique_identifier_msgs__msg__UUID>`
visualization_msgs
******************
- :py:class:`ImageMarker <rosbags.typesys.types.visualization_msgs__msg__ImageMarker>`
- :py:class:`InteractiveMarker <rosbags.typesys.types.visualization_msgs__msg__InteractiveMarker>`
- :py:class:`InteractiveMarkerControl <rosbags.typesys.types.visualization_msgs__msg__InteractiveMarkerControl>`
- :py:class:`InteractiveMarkerFeedback <rosbags.typesys.types.visualization_msgs__msg__InteractiveMarkerFeedback>`
- :py:class:`InteractiveMarkerInit <rosbags.typesys.types.visualization_msgs__msg__InteractiveMarkerInit>`
- :py:class:`InteractiveMarkerPose <rosbags.typesys.types.visualization_msgs__msg__InteractiveMarkerPose>`
- :py:class:`InteractiveMarkerUpdate <rosbags.typesys.types.visualization_msgs__msg__InteractiveMarkerUpdate>`
- :py:class:`Marker <rosbags.typesys.types.visualization_msgs__msg__Marker>`
- :py:class:`MarkerArray <rosbags.typesys.types.visualization_msgs__msg__MarkerArray>`
- :py:class:`MenuEntry <rosbags.typesys.types.visualization_msgs__msg__MenuEntry>`


@ -0,0 +1,44 @@
Type system
===========
Rosbags ships its own pure Python type system :py:mod:`rosbags.typesys`. It uses parse trees to represent message definitions internally and ships its own ``.idl`` and ``.msg`` definition parsers to convert message definition files into the internal format.
Out of the box it supports the message types defined by the standard ROS2 distribution. Message types can be parsed and added on the fly during runtime without an additional build step.
Message instances
-----------------
The type system generates a dataclass for each message type. These dataclasses give direct read-write access to all mutable fields of a message. Fields should be mutated with care, as no type checking is applied at runtime.
.. note::
Limitation: While the type system parses message definitions with array bounds and/or default values, neither bounds nor default values are enforced or assigned to message instances.
Included message types
----------------------
.. include:: ./typesys-types.rst
Extending the type system
-------------------------
Adding custom message types consists of two steps. First, message definitions are converted into parse trees using :py:func:`get_types_from_idl() <rosbags.typesys.get_types_from_idl>` or :py:func:`get_types_from_msg() <rosbags.typesys.get_types_from_msg>`; second, the types are registered in the type system via :py:func:`register_types() <rosbags.typesys.register_types>`. The following example shows how to add message type definitions from ``.msg`` and ``.idl`` files:
.. code-block:: python
from pathlib import Path
from rosbags.typesys import get_types_from_idl, get_types_from_msg, register_types
idl_text = Path('foo_msgs/msg/Foo.idl').read_text()
msg_text = Path('bar_msgs/msg/Bar.msg').read_text()
# plain dictionary to hold message definitions
add_types = {}
# add all definitions from one idl file
add_types.update(get_types_from_idl(idl_text))
# add definition from one msg file
add_types.update(get_types_from_msg(msg_text, 'bar_msgs/msg/Bar'))
# make types available to rosbags serializers/deserializers
register_types(add_types)

rosbags/pyproject.toml Normal file

@ -0,0 +1,114 @@
[build-system]
requires = ["setuptools>=65.4.0", "wheel"]
build-backend = "setuptools.build_meta"
[tool.coverage]
report.exclude_lines = [
"pragma: no cover",
"if TYPE_CHECKING:",
"if __name__ == '__main__':",
]
report.show_missing = true
report.skip_covered = true
run.branch = true
run.source = ["src"]
[tool.flake8]
avoid_escape = false
docstring_convention = "all"
docstring_style = "google"
extend_exclude = ["venv"]
ignore = [
# do not require annotation of `self`
"ANN101",
# do not apply to google convention
"D203",
"D213",
"D215",
"D406",
"D407",
"D408",
"D409",
# handled by B001
"E722",
# allow line break after binary operator
"W504",
]
max_line_length = 100
strictness = "long"
suppress_none_returning = true
[tool.isort]
include_trailing_comma = true
line_length = 100
multi_line_output = 3
[tool.mypy]
explicit_package_bases = true
fast_module_lookup = true
mypy_path = "src"
namespace_packages = true
strict = true
[[tool.mypy.overrides]]
module = "lz4.frame"
ignore_missing_imports = true
[tool.pydocstyle]
convention = "google"
add_select = ["D204", "D400", "D401", "D404", "D413"]
[tool.pylint.'MESSAGES CONTROL']
enable = "all"
disable = [
"duplicate-code",
"locally-disabled",
"suppressed-message",
"ungrouped-imports",
# isort (pylint FAQ)
"wrong-import-order",
# mccabe (pylint FAQ)
"too-many-branches",
# fixme
"fixme",
# pep8-naming (pylint FAQ, keep: invalid-name)
"bad-classmethod-argument",
"bad-mcs-classmethod-argument",
"no-self-argument",
# pycodestyle (pylint FAQ)
"bad-indentation",
"bare-except",
"line-too-long",
"missing-final-newline",
"multiple-statements",
"trailing-whitespace",
"unnecessary-semicolon",
"unneeded-not",
# pydocstyle (pylint FAQ)
"missing-class-docstring",
"missing-function-docstring",
"missing-module-docstring",
# pyflakes (pylint FAQ)
"undefined-variable",
"unused-import",
"unused-variable",
]
[tool.pytest.ini_options]
addopts = ["--cov=src", "--verbose"]
[tool.yapf]
based_on_style = "google"
column_limit = 100
allow_split_before_dict_value = false
dedent_closing_brackets = true
indent_dictionary_value = false


@ -0,0 +1,827 @@
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
# pip-compile --extra=dev --generate-hashes --output-file=requirements-dev.txt setup.cfg
#
alabaster==0.7.13 \
--hash=sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3 \
--hash=sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2
# via sphinx
astor==0.8.1 \
--hash=sha256:070a54e890cefb5b3739d19f30f5a5ec840ffc9c50ffa7d23cc9fc1a38ebbfc5 \
--hash=sha256:6a6effda93f4e1ce9f618779b2dd1d9d84f1e32812c23a29b3fff6fd7f63fa5e
# via
# flake8-simplify
# flake8-type-checking
astroid==2.14.2 \
--hash=sha256:0e0e3709d64fbffd3037e4ff403580550f14471fd3eaae9fa11cc9a5c7901153 \
--hash=sha256:a3cf9f02c53dd259144a7e8f3ccd75d67c9a8c716ef183e0c1f291bc5d7bb3cf
# via pylint
attrs==22.2.0 \
--hash=sha256:29e95c7f6778868dbd49170f98f8818f78f3dc5e0e37c0b1f474e3561b240836 \
--hash=sha256:c9227bfc2f01993c03f68db37d1d15c9690188323c067c641f1a35ca58185f99
# via
# flake8-annotations
# flake8-bugbear
# pytest
babel==2.12.1 \
--hash=sha256:b4246fb7677d3b98f501a39d43396d3cafdc8eadb045f4a31be01863f655c610 \
--hash=sha256:cc2d99999cd01d44420ae725a21c9e3711b3aadc7976d6147f622d8581963455
# via sphinx
certifi==2022.12.7 \
--hash=sha256:35824b4c3a97115964b408844d64aa14db1cc518f6562e8d7261699d1350a9e3 \
--hash=sha256:4ad3232f5e926d6718ec31cfc1fcadfde020920e278684144551c91769c7bc18
# via requests
charset-normalizer==3.0.1 \
--hash=sha256:00d3ffdaafe92a5dc603cb9bd5111aaa36dfa187c8285c543be562e61b755f6b \
--hash=sha256:024e606be3ed92216e2b6952ed859d86b4cfa52cd5bc5f050e7dc28f9b43ec42 \
--hash=sha256:0298eafff88c99982a4cf66ba2efa1128e4ddaca0b05eec4c456bbc7db691d8d \
--hash=sha256:02a51034802cbf38db3f89c66fb5d2ec57e6fe7ef2f4a44d070a593c3688667b \
--hash=sha256:083c8d17153ecb403e5e1eb76a7ef4babfc2c48d58899c98fcaa04833e7a2f9a \
--hash=sha256:0a11e971ed097d24c534c037d298ad32c6ce81a45736d31e0ff0ad37ab437d59 \
--hash=sha256:0bf2dae5291758b6f84cf923bfaa285632816007db0330002fa1de38bfcb7154 \
--hash=sha256:0c0a590235ccd933d9892c627dec5bc7511ce6ad6c1011fdf5b11363022746c1 \
--hash=sha256:0f438ae3532723fb6ead77e7c604be7c8374094ef4ee2c5e03a3a17f1fca256c \
--hash=sha256:109487860ef6a328f3eec66f2bf78b0b72400280d8f8ea05f69c51644ba6521a \
--hash=sha256:11b53acf2411c3b09e6af37e4b9005cba376c872503c8f28218c7243582df45d \
--hash=sha256:12db3b2c533c23ab812c2b25934f60383361f8a376ae272665f8e48b88e8e1c6 \
--hash=sha256:14e76c0f23218b8f46c4d87018ca2e441535aed3632ca134b10239dfb6dadd6b \
--hash=sha256:16a8663d6e281208d78806dbe14ee9903715361cf81f6d4309944e4d1e59ac5b \
--hash=sha256:292d5e8ba896bbfd6334b096e34bffb56161c81408d6d036a7dfa6929cff8783 \
--hash=sha256:2c03cc56021a4bd59be889c2b9257dae13bf55041a3372d3295416f86b295fb5 \
--hash=sha256:2e396d70bc4ef5325b72b593a72c8979999aa52fb8bcf03f701c1b03e1166918 \
--hash=sha256:2edb64ee7bf1ed524a1da60cdcd2e1f6e2b4f66ef7c077680739f1641f62f555 \
--hash=sha256:31a9ddf4718d10ae04d9b18801bd776693487cbb57d74cc3458a7673f6f34639 \
--hash=sha256:356541bf4381fa35856dafa6a965916e54bed415ad8a24ee6de6e37deccf2786 \
--hash=sha256:358a7c4cb8ba9b46c453b1dd8d9e431452d5249072e4f56cfda3149f6ab1405e \
--hash=sha256:37f8febc8ec50c14f3ec9637505f28e58d4f66752207ea177c1d67df25da5aed \
--hash=sha256:39049da0ffb96c8cbb65cbf5c5f3ca3168990adf3551bd1dee10c48fce8ae820 \
--hash=sha256:39cf9ed17fe3b1bc81f33c9ceb6ce67683ee7526e65fde1447c772afc54a1bb8 \
--hash=sha256:3ae1de54a77dc0d6d5fcf623290af4266412a7c4be0b1ff7444394f03f5c54e3 \
--hash=sha256:3b590df687e3c5ee0deef9fc8c547d81986d9a1b56073d82de008744452d6541 \
--hash=sha256:3e45867f1f2ab0711d60c6c71746ac53537f1684baa699f4f668d4c6f6ce8e14 \
--hash=sha256:3fc1c4a2ffd64890aebdb3f97e1278b0cc72579a08ca4de8cd2c04799a3a22be \
--hash=sha256:4457ea6774b5611f4bed5eaa5df55f70abde42364d498c5134b7ef4c6958e20e \
--hash=sha256:44ba614de5361b3e5278e1241fda3dc1838deed864b50a10d7ce92983797fa76 \
--hash=sha256:4a8fcf28c05c1f6d7e177a9a46a1c52798bfe2ad80681d275b10dcf317deaf0b \
--hash=sha256:4b0d02d7102dd0f997580b51edc4cebcf2ab6397a7edf89f1c73b586c614272c \
--hash=sha256:502218f52498a36d6bf5ea77081844017bf7982cdbe521ad85e64cabee1b608b \
--hash=sha256:503e65837c71b875ecdd733877d852adbc465bd82c768a067badd953bf1bc5a3 \
--hash=sha256:5995f0164fa7df59db4746112fec3f49c461dd6b31b841873443bdb077c13cfc \
--hash=sha256:59e5686dd847347e55dffcc191a96622f016bc0ad89105e24c14e0d6305acbc6 \
--hash=sha256:601f36512f9e28f029d9481bdaf8e89e5148ac5d89cffd3b05cd533eeb423b59 \
--hash=sha256:608862a7bf6957f2333fc54ab4399e405baad0163dc9f8d99cb236816db169d4 \
--hash=sha256:62595ab75873d50d57323a91dd03e6966eb79c41fa834b7a1661ed043b2d404d \
--hash=sha256:70990b9c51340e4044cfc394a81f614f3f90d41397104d226f21e66de668730d \
--hash=sha256:71140351489970dfe5e60fc621ada3e0f41104a5eddaca47a7acb3c1b851d6d3 \
--hash=sha256:72966d1b297c741541ca8cf1223ff262a6febe52481af742036a0b296e35fa5a \
--hash=sha256:74292fc76c905c0ef095fe11e188a32ebd03bc38f3f3e9bcb85e4e6db177b7ea \
--hash=sha256:761e8904c07ad053d285670f36dd94e1b6ab7f16ce62b9805c475b7aa1cffde6 \
--hash=sha256:772b87914ff1152b92a197ef4ea40efe27a378606c39446ded52c8f80f79702e \
--hash=sha256:79909e27e8e4fcc9db4addea88aa63f6423ebb171db091fb4373e3312cb6d603 \
--hash=sha256:7e189e2e1d3ed2f4aebabd2d5b0f931e883676e51c7624826e0a4e5fe8a0bf24 \
--hash=sha256:7eb33a30d75562222b64f569c642ff3dc6689e09adda43a082208397f016c39a \
--hash=sha256:81d6741ab457d14fdedc215516665050f3822d3e56508921cc7239f8c8e66a58 \
--hash=sha256:8499ca8f4502af841f68135133d8258f7b32a53a1d594aa98cc52013fff55678 \
--hash=sha256:84c3990934bae40ea69a82034912ffe5a62c60bbf6ec5bc9691419641d7d5c9a \
--hash=sha256:87701167f2a5c930b403e9756fab1d31d4d4da52856143b609e30a1ce7160f3c \
--hash=sha256:88600c72ef7587fe1708fd242b385b6ed4b8904976d5da0893e31df8b3480cb6 \
--hash=sha256:8ac7b6a045b814cf0c47f3623d21ebd88b3e8cf216a14790b455ea7ff0135d18 \
--hash=sha256:8b8af03d2e37866d023ad0ddea594edefc31e827fee64f8de5611a1dbc373174 \
--hash=sha256:8c7fe7afa480e3e82eed58e0ca89f751cd14d767638e2550c77a92a9e749c317 \
--hash=sha256:8eade758719add78ec36dc13201483f8e9b5d940329285edcd5f70c0a9edbd7f \
--hash=sha256:911d8a40b2bef5b8bbae2e36a0b103f142ac53557ab421dc16ac4aafee6f53dc \
--hash=sha256:93ad6d87ac18e2a90b0fe89df7c65263b9a99a0eb98f0a3d2e079f12a0735837 \
--hash=sha256:95dea361dd73757c6f1c0a1480ac499952c16ac83f7f5f4f84f0658a01b8ef41 \
--hash=sha256:9ab77acb98eba3fd2a85cd160851816bfce6871d944d885febf012713f06659c \
--hash=sha256:9cb3032517f1627cc012dbc80a8ec976ae76d93ea2b5feaa9d2a5b8882597579 \
--hash=sha256:9cf4e8ad252f7c38dd1f676b46514f92dc0ebeb0db5552f5f403509705e24753 \
--hash=sha256:9d9153257a3f70d5f69edf2325357251ed20f772b12e593f3b3377b5f78e7ef8 \
--hash=sha256:a152f5f33d64a6be73f1d30c9cc82dfc73cec6477ec268e7c6e4c7d23c2d2291 \
--hash=sha256:a16418ecf1329f71df119e8a65f3aa68004a3f9383821edcb20f0702934d8087 \
--hash=sha256:a60332922359f920193b1d4826953c507a877b523b2395ad7bc716ddd386d866 \
--hash=sha256:a8d0fc946c784ff7f7c3742310cc8a57c5c6dc31631269876a88b809dbeff3d3 \
--hash=sha256:ab5de034a886f616a5668aa5d098af2b5385ed70142090e2a31bcbd0af0fdb3d \
--hash=sha256:c22d3fe05ce11d3671297dc8973267daa0f938b93ec716e12e0f6dee81591dc1 \
--hash=sha256:c2ac1b08635a8cd4e0cbeaf6f5e922085908d48eb05d44c5ae9eabab148512ca \
--hash=sha256:c512accbd6ff0270939b9ac214b84fb5ada5f0409c44298361b2f5e13f9aed9e \
--hash=sha256:c75ffc45f25324e68ab238cb4b5c0a38cd1c3d7f1fb1f72b5541de469e2247db \
--hash=sha256:c95a03c79bbe30eec3ec2b7f076074f4281526724c8685a42872974ef4d36b72 \
--hash=sha256:cadaeaba78750d58d3cc6ac4d1fd867da6fc73c88156b7a3212a3cd4819d679d \
--hash=sha256:cd6056167405314a4dc3c173943f11249fa0f1b204f8b51ed4bde1a9cd1834dc \
--hash=sha256:db72b07027db150f468fbada4d85b3b2729a3db39178abf5c543b784c1254539 \
--hash=sha256:df2c707231459e8a4028eabcd3cfc827befd635b3ef72eada84ab13b52e1574d \
--hash=sha256:e62164b50f84e20601c1ff8eb55620d2ad25fb81b59e3cd776a1902527a788af \
--hash=sha256:e696f0dd336161fca9adbb846875d40752e6eba585843c768935ba5c9960722b \
--hash=sha256:eaa379fcd227ca235d04152ca6704c7cb55564116f8bc52545ff357628e10602 \
--hash=sha256:ebea339af930f8ca5d7a699b921106c6e29c617fe9606fa7baa043c1cdae326f \
--hash=sha256:f4c39b0e3eac288fedc2b43055cfc2ca7a60362d0e5e87a637beac5d801ef478 \
--hash=sha256:f5057856d21e7586765171eac8b9fc3f7d44ef39425f85dbcccb13b3ebea806c \
--hash=sha256:f6f45710b4459401609ebebdbcfb34515da4fc2aa886f95107f556ac69a9147e \
--hash=sha256:f97e83fa6c25693c7a35de154681fcc257c1c41b38beb0304b9c4d2d9e164479 \
--hash=sha256:f9d0c5c045a3ca9bedfc35dca8526798eb91a07aa7a2c0fee134c6c6f321cbd7 \
--hash=sha256:ff6f3db31555657f3163b15a6b7c6938d08df7adbfc9dd13d9d19edad678f1e8
# via requests
classify-imports==4.2.0 \
--hash=sha256:7abfb7ea92149b29d046bd34573d247ba6e68cc28100c801eba4af17964fc40e \
--hash=sha256:dbbc264b70a470ed8c6c95976a11dfb8b7f63df44ed1af87328bbed2663f5161
# via flake8-type-checking
coverage[toml]==7.2.1 \
--hash=sha256:0339dc3237c0d31c3b574f19c57985fcbe494280153bbcad33f2cdf469f4ac3e \
--hash=sha256:09643fb0df8e29f7417adc3f40aaf379d071ee8f0350ab290517c7004f05360b \
--hash=sha256:0bd7e628f6c3ec4e7d2d24ec0e50aae4e5ae95ea644e849d92ae4805650b4c4e \
--hash=sha256:0cf557827be7eca1c38a2480484d706693e7bb1929e129785fe59ec155a59de6 \
--hash=sha256:0f8318ed0f3c376cfad8d3520f496946977abde080439d6689d7799791457454 \
--hash=sha256:1b7fb13850ecb29b62a447ac3516c777b0e7a09ecb0f4bb6718a8654c87dfc80 \
--hash=sha256:22c308bc508372576ffa3d2dbc4824bb70d28eeb4fcd79d4d1aed663a06630d0 \
--hash=sha256:3004765bca3acd9e015794e5c2f0c9a05587f5e698127ff95e9cfba0d3f29339 \
--hash=sha256:3a209d512d157379cc9ab697cbdbb4cfd18daa3e7eebaa84c3d20b6af0037384 \
--hash=sha256:436313d129db7cf5b4ac355dd2bd3f7c7e5294af077b090b85de75f8458b8616 \
--hash=sha256:49567ec91fc5e0b15356da07a2feabb421d62f52a9fff4b1ec40e9e19772f5f8 \
--hash=sha256:4dd34a935de268a133e4741827ae951283a28c0125ddcdbcbba41c4b98f2dfef \
--hash=sha256:570c21a29493b350f591a4b04c158ce1601e8d18bdcd21db136fbb135d75efa6 \
--hash=sha256:5928b85416a388dd557ddc006425b0c37e8468bd1c3dc118c1a3de42f59e2a54 \
--hash=sha256:5d2b9b5e70a21474c105a133ba227c61bc95f2ac3b66861143ce39a5ea4b3f84 \
--hash=sha256:617a94ada56bbfe547aa8d1b1a2b8299e2ec1ba14aac1d4b26a9f7d6158e1273 \
--hash=sha256:6a034480e9ebd4e83d1aa0453fd78986414b5d237aea89a8fdc35d330aa13bae \
--hash=sha256:6fce673f79a0e017a4dc35e18dc7bb90bf6d307c67a11ad5e61ca8d42b87cbff \
--hash=sha256:78d2c3dde4c0b9be4b02067185136b7ee4681978228ad5ec1278fa74f5ca3e99 \
--hash=sha256:7f099da6958ddfa2ed84bddea7515cb248583292e16bb9231d151cd528eab657 \
--hash=sha256:80559eaf6c15ce3da10edb7977a1548b393db36cbc6cf417633eca05d84dd1ed \
--hash=sha256:834c2172edff5a08d78e2f53cf5e7164aacabeb66b369f76e7bb367ca4e2d993 \
--hash=sha256:861cc85dfbf55a7a768443d90a07e0ac5207704a9f97a8eb753292a7fcbdfcfc \
--hash=sha256:8649371570551d2fd7dee22cfbf0b61f1747cdfb2b7587bb551e4beaaa44cb97 \
--hash=sha256:87dc37f16fb5e3a28429e094145bf7c1753e32bb50f662722e378c5851f7fdc6 \
--hash=sha256:8a6450da4c7afc4534305b2b7d8650131e130610cea448ff240b6ab73d7eab63 \
--hash=sha256:8d3843ca645f62c426c3d272902b9de90558e9886f15ddf5efe757b12dd376f5 \
--hash=sha256:8dca3c1706670297851bca1acff9618455122246bdae623be31eca744ade05ec \
--hash=sha256:97a3189e019d27e914ecf5c5247ea9f13261d22c3bb0cfcfd2a9b179bb36f8b1 \
--hash=sha256:99f4dd81b2bb8fc67c3da68b1f5ee1650aca06faa585cbc6818dbf67893c6d58 \
--hash=sha256:9e872b082b32065ac2834149dc0adc2a2e6d8203080501e1e3c3c77851b466f9 \
--hash=sha256:a81dbcf6c6c877986083d00b834ac1e84b375220207a059ad45d12f6e518a4e3 \
--hash=sha256:abacd0a738e71b20e224861bc87e819ef46fedba2fb01bc1af83dfd122e9c319 \
--hash=sha256:ae82c988954722fa07ec5045c57b6d55bc1a0890defb57cf4a712ced65b26ddd \
--hash=sha256:b0c0d46de5dd97f6c2d1b560bf0fcf0215658097b604f1840365296302a9d1fb \
--hash=sha256:b1991a6d64231a3e5bbe3099fb0dd7c9aeaa4275ad0e0aeff4cb9ef885c62ba2 \
--hash=sha256:b2167d116309f564af56f9aa5e75ef710ef871c5f9b313a83050035097b56820 \
--hash=sha256:bd5a12239c0006252244f94863f1c518ac256160cd316ea5c47fb1a11b25889a \
--hash=sha256:bdd3f2f285ddcf2e75174248b2406189261a79e7fedee2ceeadc76219b6faa0e \
--hash=sha256:c77f2a9093ccf329dd523a9b2b3c854c20d2a3d968b6def3b820272ca6732242 \
--hash=sha256:cb5f152fb14857cbe7f3e8c9a5d98979c4c66319a33cad6e617f0067c9accdc4 \
--hash=sha256:cca7c0b7f5881dfe0291ef09ba7bb1582cb92ab0aeffd8afb00c700bf692415a \
--hash=sha256:d2ef6cae70168815ed91388948b5f4fcc69681480a0061114db737f957719f03 \
--hash=sha256:d9256d4c60c4bbfec92721b51579c50f9e5062c21c12bec56b55292464873508 \
--hash=sha256:e191a63a05851f8bce77bc875e75457f9b01d42843f8bd7feed2fc26bbe60833 \
--hash=sha256:e2b50ebc2b6121edf352336d503357321b9d8738bb7a72d06fc56153fd3f4cd8 \
--hash=sha256:e3ea04b23b114572b98a88c85379e9e9ae031272ba1fb9b532aa934c621626d4 \
--hash=sha256:e4d70c853f0546855f027890b77854508bdb4d6a81242a9d804482e667fff6e6 \
--hash=sha256:f29351393eb05e6326f044a7b45ed8e38cb4dcc38570d12791f271399dc41431 \
--hash=sha256:f3d07edb912a978915576a776756069dede66d012baa503022d3a0adba1b6afa \
--hash=sha256:fac6343bae03b176e9b58104a9810df3cdccd5cfed19f99adfa807ffbf43cf9b
# via pytest-cov
darglint==1.8.1 \
--hash=sha256:080d5106df149b199822e7ee7deb9c012b49891538f14a11be681044f0bb20da \
--hash=sha256:5ae11c259c17b0701618a20c3da343a3eb98b3bc4b5a83d31cdd94f5ebdced8d
# via rosbags (setup.cfg)
dill==0.3.6 \
--hash=sha256:a07ffd2351b8c678dfc4a856a3005f8067aea51d6ba6c700796a4d9e280f39f0 \
--hash=sha256:e5db55f3687856d8fbdab002ed78544e1c4559a130302693d839dfe8f93f2373
# via pylint
docutils==0.18.1 \
--hash=sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c \
--hash=sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06
# via
# sphinx
# sphinx-rtd-theme
exceptiongroup==1.1.0 \
--hash=sha256:327cbda3da756e2de031a3107b81ab7b3770a602c4d16ca618298c526f4bec1e \
--hash=sha256:bcb67d800a4497e1b404c2dd44fca47d3b7a5e5433dbab67f96c1a685cdfdf23
# via pytest
flake8==6.0.0 \
--hash=sha256:3833794e27ff64ea4e9cf5d410082a8b97ff1a06c16aa3d2027339cd0f1195c7 \
--hash=sha256:c61007e76655af75e6785a931f452915b371dc48f56efd765247c8fe68f2b181
# via
# flake8-annotations
# flake8-bugbear
# flake8-commas
# flake8-comprehensions
# flake8-docstrings
# flake8-isort
# flake8-mutable
# flake8-print
# flake8-pyprojecttoml
# flake8-quotes
# flake8-simplify
# flake8-type-checking
# flake8-use-fstring
# pep8-naming
# rosbags (setup.cfg)
flake8-annotations==3.0.0 \
--hash=sha256:88c8b35a0db10b9a92be69ed3f81494509a18db1c3162551e57bc0fc35fab065 \
--hash=sha256:ea927d31016515e9aa6e256651d74baeeee6fa4ad3f8383715ec5c0460a4c225
# via rosbags (setup.cfg)
flake8-bugbear==23.2.13 \
--hash=sha256:39259814a83f33c8409417ee12dd4050c9c0bb4c8707c12fc18ae62b2f3ddee1 \
--hash=sha256:f136bd0ca2684f101168bba2310dec541e11aa6b252260c17dcf58d18069a740
# via rosbags (setup.cfg)
flake8-commas==2.1.0 \
--hash=sha256:940441ab8ee544df564ae3b3f49f20462d75d5c7cac2463e0b27436e2050f263 \
--hash=sha256:ebb96c31e01d0ef1d0685a21f3f0e2f8153a0381430e748bf0bbbb5d5b453d54
# via rosbags (setup.cfg)
flake8-comprehensions==3.10.1 \
--hash=sha256:412052ac4a947f36b891143430fef4859705af11b2572fbb689f90d372cf26ab \
--hash=sha256:d763de3c74bc18a79c039a7ec732e0a1985b0c79309ceb51e56401ad0a2cd44e
# via rosbags (setup.cfg)
flake8-docstrings==1.7.0 \
--hash=sha256:4c8cc748dc16e6869728699e5d0d685da9a10b0ea718e090b1ba088e67a941af \
--hash=sha256:51f2344026da083fc084166a9353f5082b01f72901df422f74b4d953ae88ac75
# via rosbags (setup.cfg)
flake8-fixme==1.1.1 \
--hash=sha256:226a6f2ef916730899f29ac140bed5d4a17e5aba79f00a0e3ae1eff1997cb1ac \
--hash=sha256:50cade07d27a4c30d4f12351478df87339e67640c83041b664724bda6d16f33a
# via rosbags (setup.cfg)
flake8-isort==6.0.0 \
--hash=sha256:537f453a660d7e903f602ecfa36136b140de279df58d02eb1b6a0c84e83c528c \
--hash=sha256:aa0cac02a62c7739e370ce6b9c31743edac904bae4b157274511fc8a19c75bbc
# via rosbags (setup.cfg)
flake8-mutable==1.2.0 \
--hash=sha256:38fd9dadcbcda6550a916197bc40ed76908119dabb37fbcca30873666c31d2d5 \
--hash=sha256:ee9b77111b867d845177bbc289d87d541445ffcc6029a0c5c65865b42b18c6a6
# via rosbags (setup.cfg)
flake8-plugin-utils==1.3.2 \
--hash=sha256:1fe43e3e9acf3a7c0f6b88f5338cad37044d2f156c43cb6b080b5f9da8a76f06 \
--hash=sha256:20fa2a8ca2decac50116edb42e6af0a1253ef639ad79941249b840531889c65a
# via
# flake8-pytest-style
# flake8-return
flake8-print==5.0.0 \
--hash=sha256:76915a2a389cc1c0879636c219eb909c38501d3a43cc8dae542081c9ba48bdf9 \
--hash=sha256:84a1a6ea10d7056b804221ac5e62b1cee1aefc897ce16f2e5c42d3046068f5d8
# via rosbags (setup.cfg)
flake8-pyprojecttoml==0.0.2 \
--hash=sha256:37b9a8e5274d04591fbecc0782f626fb31b8d266e26192ce18ada25dabfc9f1d \
--hash=sha256:b78dd64254e2d6aa596b5be4e2a41b81147cafec57760de1c7322d025159983c
# via rosbags (setup.cfg)
flake8-pytest-style==1.7.2 \
--hash=sha256:b924197c99b951315949920b0e5547f34900b1844348432e67a44ab191582109 \
--hash=sha256:f5d2aa3219163a052dd92226589d45fab8ea027a3269922f0c4029f548ea5cd1
# via rosbags (setup.cfg)
flake8-quotes==3.3.2 \
--hash=sha256:6e26892b632dacba517bf27219c459a8396dcfac0f5e8204904c5a4ba9b480e1
# via rosbags (setup.cfg)
flake8-return==1.2.0 \
--hash=sha256:1f07af12954ed03ebe2c2aac2418f78b55374e9929d4956109664588f31582a1 \
--hash=sha256:68dfa56582cd704febd02ad86dcf5df67e38e0836d62f1ceae7930d76d3dd955
# via rosbags (setup.cfg)
flake8-simplify==0.19.3 \
--hash=sha256:1057320e9312d75849541fee822900d27bcad05b2405edc84713affee635629e \
--hash=sha256:2fb083bf5142a98d9c9554755cf2f56f8926eb4a33eae30c0809041b1546879e
# via rosbags (setup.cfg)
flake8-type-checking==2.3.0 \
--hash=sha256:7117b8a22d64db02f9d8c724df5d2517e59c6290b034cfa54496c7ae73c07f51 \
--hash=sha256:f802c9933b2a98b96fc4a0b3b90ef0f8379625f867cb73633c09fc2bf746333b
# via rosbags (setup.cfg)
flake8-use-fstring==1.4 \
--hash=sha256:6550bf722585eb97dffa8343b0f1c372101f5c4ab5b07ebf0edd1c79880cdd39
# via rosbags (setup.cfg)
idna==3.4 \
--hash=sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4 \
--hash=sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2
# via requests
imagesize==1.4.1 \
--hash=sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b \
--hash=sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a
# via sphinx
importlib-metadata==6.0.0 \
--hash=sha256:7efb448ec9a5e313a57655d35aa54cd3e01b7e1fbcf72dce1bf06119420f5bad \
--hash=sha256:e354bedeb60efa6affdcc8ae121b73544a7aa74156d047311948f6d711cd378d
# via sphinx
iniconfig==2.0.0 \
--hash=sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3 \
--hash=sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374
# via pytest
isort==5.12.0 \
--hash=sha256:8bef7dde241278824a6d83f44a544709b065191b95b6e50894bdc722fcba0504 \
--hash=sha256:f84c2818376e66cf843d497486ea8fed8700b340f308f076c6fb1229dff318b6
# via
# flake8-isort
# pylint
jinja2==3.1.2 \
--hash=sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852 \
--hash=sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61
# via sphinx
lazy-object-proxy==1.9.0 \
--hash=sha256:09763491ce220c0299688940f8dc2c5d05fd1f45af1e42e636b2e8b2303e4382 \
--hash=sha256:0a891e4e41b54fd5b8313b96399f8b0e173bbbfc03c7631f01efbe29bb0bcf82 \
--hash=sha256:189bbd5d41ae7a498397287c408617fe5c48633e7755287b21d741f7db2706a9 \
--hash=sha256:18b78ec83edbbeb69efdc0e9c1cb41a3b1b1ed11ddd8ded602464c3fc6020494 \
--hash=sha256:1aa3de4088c89a1b69f8ec0dcc169aa725b0ff017899ac568fe44ddc1396df46 \
--hash=sha256:212774e4dfa851e74d393a2370871e174d7ff0ebc980907723bb67d25c8a7c30 \
--hash=sha256:2d0daa332786cf3bb49e10dc6a17a52f6a8f9601b4cf5c295a4f85854d61de63 \
--hash=sha256:5f83ac4d83ef0ab017683d715ed356e30dd48a93746309c8f3517e1287523ef4 \
--hash=sha256:659fb5809fa4629b8a1ac5106f669cfc7bef26fbb389dda53b3e010d1ac4ebae \
--hash=sha256:660c94ea760b3ce47d1855a30984c78327500493d396eac4dfd8bd82041b22be \
--hash=sha256:66a3de4a3ec06cd8af3f61b8e1ec67614fbb7c995d02fa224813cb7afefee701 \
--hash=sha256:721532711daa7db0d8b779b0bb0318fa87af1c10d7fe5e52ef30f8eff254d0cd \
--hash=sha256:7322c3d6f1766d4ef1e51a465f47955f1e8123caee67dd641e67d539a534d006 \
--hash=sha256:79a31b086e7e68b24b99b23d57723ef7e2c6d81ed21007b6281ebcd1688acb0a \
--hash=sha256:81fc4d08b062b535d95c9ea70dbe8a335c45c04029878e62d744bdced5141586 \
--hash=sha256:8fa02eaab317b1e9e03f69aab1f91e120e7899b392c4fc19807a8278a07a97e8 \
--hash=sha256:9090d8e53235aa280fc9239a86ae3ea8ac58eff66a705fa6aa2ec4968b95c821 \
--hash=sha256:946d27deaff6cf8452ed0dba83ba38839a87f4f7a9732e8f9fd4107b21e6ff07 \
--hash=sha256:9990d8e71b9f6488e91ad25f322898c136b008d87bf852ff65391b004da5e17b \
--hash=sha256:9cd077f3d04a58e83d04b20e334f678c2b0ff9879b9375ed107d5d07ff160171 \
--hash=sha256:9e7551208b2aded9c1447453ee366f1c4070602b3d932ace044715d89666899b \
--hash=sha256:9f5fa4a61ce2438267163891961cfd5e32ec97a2c444e5b842d574251ade27d2 \
--hash=sha256:b40387277b0ed2d0602b8293b94d7257e17d1479e257b4de114ea11a8cb7f2d7 \
--hash=sha256:bfb38f9ffb53b942f2b5954e0f610f1e721ccebe9cce9025a38c8ccf4a5183a4 \
--hash=sha256:cbf9b082426036e19c6924a9ce90c740a9861e2bdc27a4834fd0a910742ac1e8 \
--hash=sha256:d9e25ef10a39e8afe59a5c348a4dbf29b4868ab76269f81ce1674494e2565a6e \
--hash=sha256:db1c1722726f47e10e0b5fdbf15ac3b8adb58c091d12b3ab713965795036985f \
--hash=sha256:e7c21c95cae3c05c14aafffe2865bbd5e377cfc1348c4f7751d9dc9a48ca4bda \
--hash=sha256:e8c6cfb338b133fbdbc5cfaa10fe3c6aeea827db80c978dbd13bc9dd8526b7d4 \
--hash=sha256:ea806fd4c37bf7e7ad82537b0757999264d5f70c45468447bb2b91afdbe73a6e \
--hash=sha256:edd20c5a55acb67c7ed471fa2b5fb66cb17f61430b7a6b9c3b4a1e40293b1671 \
--hash=sha256:f0117049dd1d5635bbff65444496c90e0baa48ea405125c088e93d9cf4525b11 \
--hash=sha256:f0705c376533ed2a9e5e97aacdbfe04cecd71e0aa84c7c0595d02ef93b6e4455 \
--hash=sha256:f12ad7126ae0c98d601a7ee504c1122bcef553d1d5e0c3bfa77b16b3968d2734 \
--hash=sha256:f2457189d8257dd41ae9b434ba33298aec198e30adf2dcdaaa3a28b9994f6adb \
--hash=sha256:f699ac1c768270c9e384e4cbd268d6e67aebcfae6cd623b4d7c3bfde5a35db59
# via astroid
lz4==4.3.2 \
--hash=sha256:0ca83a623c449295bafad745dcd399cea4c55b16b13ed8cfea30963b004016c9 \
--hash=sha256:0f5614d8229b33d4a97cb527db2a1ac81308c6e796e7bdb5d1309127289f69d5 \
--hash=sha256:1c4c100d99eed7c08d4e8852dd11e7d1ec47a3340f49e3a96f8dfbba17ffb300 \
--hash=sha256:1f25eb322eeb24068bb7647cae2b0732b71e5c639e4e4026db57618dcd8279f0 \
--hash=sha256:200d05777d61ba1ff8d29cb51c534a162ea0b4fe6d3c28be3571a0a48ff36080 \
--hash=sha256:31d72731c4ac6ebdce57cd9a5cabe0aecba229c4f31ba3e2c64ae52eee3fdb1c \
--hash=sha256:3a85b430138882f82f354135b98c320dafb96fc8fe4656573d95ab05de9eb092 \
--hash=sha256:4931ab28a0d1c133104613e74eec1b8bb1f52403faabe4f47f93008785c0b929 \
--hash=sha256:4caedeb19e3ede6c7a178968b800f910db6503cb4cb1e9cc9221157572139b49 \
--hash=sha256:65d5c93f8badacfa0456b660285e394e65023ef8071142e0dcbd4762166e1be0 \
--hash=sha256:6b50f096a6a25f3b2edca05aa626ce39979d63c3b160687c8c6d50ac3943d0ba \
--hash=sha256:7211dc8f636ca625abc3d4fb9ab74e5444b92df4f8d58ec83c8868a2b0ff643d \
--hash=sha256:7a9eec24ec7d8c99aab54de91b4a5a149559ed5b3097cf30249b665689b3d402 \
--hash=sha256:7c2df117def1589fba1327dceee51c5c2176a2b5a7040b45e84185ce0c08b6a3 \
--hash=sha256:7e2dc1bd88b60fa09b9b37f08553f45dc2b770c52a5996ea52b2b40f25445676 \
--hash=sha256:83903fe6db92db0be101acedc677aa41a490b561567fe1b3fe68695b2110326c \
--hash=sha256:83acfacab3a1a7ab9694333bcb7950fbeb0be21660d236fd09c8337a50817897 \
--hash=sha256:86480f14a188c37cb1416cdabacfb4e42f7a5eab20a737dac9c4b1c227f3b822 \
--hash=sha256:867664d9ca9bdfce840ac96d46cd8838c9ae891e859eb98ce82fcdf0e103a947 \
--hash=sha256:8df16c9a2377bdc01e01e6de5a6e4bbc66ddf007a6b045688e285d7d9d61d1c9 \
--hash=sha256:8f00a9ba98f6364cadda366ae6469b7b3568c0cced27e16a47ddf6b774169270 \
--hash=sha256:926b26db87ec8822cf1870efc3d04d06062730ec3279bbbd33ba47a6c0a5c673 \
--hash=sha256:a6a46889325fd60b8a6b62ffc61588ec500a1883db32cddee9903edfba0b7584 \
--hash=sha256:a98b61e504fb69f99117b188e60b71e3c94469295571492a6468c1acd63c37ba \
--hash=sha256:ad38dc6a7eea6f6b8b642aaa0683253288b0460b70cab3216838747163fb774d \
--hash=sha256:b10b77dc2e6b1daa2f11e241141ab8285c42b4ed13a8642495620416279cc5b2 \
--hash=sha256:d5ea0e788dc7e2311989b78cae7accf75a580827b4d96bbaf06c7e5a03989bd5 \
--hash=sha256:e05afefc4529e97c08e65ef92432e5f5225c0bb21ad89dee1e06a882f91d7f5e \
--hash=sha256:e1431d84a9cfb23e6773e72078ce8e65cad6745816d4cbf9ae67da5ea419acda \
--hash=sha256:ec6755cacf83f0c5588d28abb40a1ac1643f2ff2115481089264c7630236618a \
--hash=sha256:edc2fb3463d5d9338ccf13eb512aab61937be50aa70734bcf873f2f493801d3b \
--hash=sha256:edd8987d8415b5dad25e797043936d91535017237f72fa456601be1479386c92 \
--hash=sha256:edda4fb109439b7f3f58ed6bede59694bc631c4b69c041112b1b7dc727fffb23 \
--hash=sha256:f571eab7fec554d3b1db0d666bdc2ad85c81f4b8cb08906c4c59a8cad75e6e22 \
--hash=sha256:f7c50542b4ddceb74ab4f8b3435327a0861f06257ca501d59067a6a482535a77
# via rosbags (setup.cfg)
markupsafe==2.1.2 \
--hash=sha256:0576fe974b40a400449768941d5d0858cc624e3249dfd1e0c33674e5c7ca7aed \
--hash=sha256:085fd3201e7b12809f9e6e9bc1e5c96a368c8523fad5afb02afe3c051ae4afcc \
--hash=sha256:090376d812fb6ac5f171e5938e82e7f2d7adc2b629101cec0db8b267815c85e2 \
--hash=sha256:0b462104ba25f1ac006fdab8b6a01ebbfbce9ed37fd37fd4acd70c67c973e460 \
--hash=sha256:137678c63c977754abe9086a3ec011e8fd985ab90631145dfb9294ad09c102a7 \
--hash=sha256:1bea30e9bf331f3fef67e0a3877b2288593c98a21ccb2cf29b74c581a4eb3af0 \
--hash=sha256:22152d00bf4a9c7c83960521fc558f55a1adbc0631fbb00a9471e097b19d72e1 \
--hash=sha256:22731d79ed2eb25059ae3df1dfc9cb1546691cc41f4e3130fe6bfbc3ecbbecfa \
--hash=sha256:2298c859cfc5463f1b64bd55cb3e602528db6fa0f3cfd568d3605c50678f8f03 \
--hash=sha256:28057e985dace2f478e042eaa15606c7efccb700797660629da387eb289b9323 \
--hash=sha256:2e7821bffe00aa6bd07a23913b7f4e01328c3d5cc0b40b36c0bd81d362faeb65 \
--hash=sha256:2ec4f2d48ae59bbb9d1f9d7efb9236ab81429a764dedca114f5fdabbc3788013 \
--hash=sha256:340bea174e9761308703ae988e982005aedf427de816d1afe98147668cc03036 \
--hash=sha256:40627dcf047dadb22cd25ea7ecfe9cbf3bbbad0482ee5920b582f3809c97654f \
--hash=sha256:40dfd3fefbef579ee058f139733ac336312663c6706d1163b82b3003fb1925c4 \
--hash=sha256:4cf06cdc1dda95223e9d2d3c58d3b178aa5dacb35ee7e3bbac10e4e1faacb419 \
--hash=sha256:50c42830a633fa0cf9e7d27664637532791bfc31c731a87b202d2d8ac40c3ea2 \
--hash=sha256:55f44b440d491028addb3b88f72207d71eeebfb7b5dbf0643f7c023ae1fba619 \
--hash=sha256:608e7073dfa9e38a85d38474c082d4281f4ce276ac0010224eaba11e929dd53a \
--hash=sha256:63ba06c9941e46fa389d389644e2d8225e0e3e5ebcc4ff1ea8506dce646f8c8a \
--hash=sha256:65608c35bfb8a76763f37036547f7adfd09270fbdbf96608be2bead319728fcd \
--hash=sha256:665a36ae6f8f20a4676b53224e33d456a6f5a72657d9c83c2aa00765072f31f7 \
--hash=sha256:6d6607f98fcf17e534162f0709aaad3ab7a96032723d8ac8750ffe17ae5a0666 \
--hash=sha256:7313ce6a199651c4ed9d7e4cfb4aa56fe923b1adf9af3b420ee14e6d9a73df65 \
--hash=sha256:7668b52e102d0ed87cb082380a7e2e1e78737ddecdde129acadb0eccc5423859 \
--hash=sha256:7df70907e00c970c60b9ef2938d894a9381f38e6b9db73c5be35e59d92e06625 \
--hash=sha256:7e007132af78ea9df29495dbf7b5824cb71648d7133cf7848a2a5dd00d36f9ff \
--hash=sha256:835fb5e38fd89328e9c81067fd642b3593c33e1e17e2fdbf77f5676abb14a156 \
--hash=sha256:8bca7e26c1dd751236cfb0c6c72d4ad61d986e9a41bbf76cb445f69488b2a2bd \
--hash=sha256:8db032bf0ce9022a8e41a22598eefc802314e81b879ae093f36ce9ddf39ab1ba \
--hash=sha256:99625a92da8229df6d44335e6fcc558a5037dd0a760e11d84be2260e6f37002f \
--hash=sha256:9cad97ab29dfc3f0249b483412c85c8ef4766d96cdf9dcf5a1e3caa3f3661cf1 \
--hash=sha256:a4abaec6ca3ad8660690236d11bfe28dfd707778e2442b45addd2f086d6ef094 \
--hash=sha256:a6e40afa7f45939ca356f348c8e23048e02cb109ced1eb8420961b2f40fb373a \
--hash=sha256:a6f2fcca746e8d5910e18782f976489939d54a91f9411c32051b4aab2bd7c513 \
--hash=sha256:a806db027852538d2ad7555b203300173dd1b77ba116de92da9afbc3a3be3eed \
--hash=sha256:abcabc8c2b26036d62d4c746381a6f7cf60aafcc653198ad678306986b09450d \
--hash=sha256:b8526c6d437855442cdd3d87eede9c425c4445ea011ca38d937db299382e6fa3 \
--hash=sha256:bb06feb762bade6bf3c8b844462274db0c76acc95c52abe8dbed28ae3d44a147 \
--hash=sha256:c0a33bc9f02c2b17c3ea382f91b4db0e6cde90b63b296422a939886a7a80de1c \
--hash=sha256:c4a549890a45f57f1ebf99c067a4ad0cb423a05544accaf2b065246827ed9603 \
--hash=sha256:ca244fa73f50a800cf8c3ebf7fd93149ec37f5cb9596aa8873ae2c1d23498601 \
--hash=sha256:cf877ab4ed6e302ec1d04952ca358b381a882fbd9d1b07cccbfd61783561f98a \
--hash=sha256:d9d971ec1e79906046aa3ca266de79eac42f1dbf3612a05dc9368125952bd1a1 \
--hash=sha256:da25303d91526aac3672ee6d49a2f3db2d9502a4a60b55519feb1a4c7714e07d \
--hash=sha256:e55e40ff0cc8cc5c07996915ad367fa47da6b3fc091fdadca7f5403239c5fec3 \
--hash=sha256:f03a532d7dee1bed20bc4884194a16160a2de9ffc6354b3878ec9682bb623c54 \
--hash=sha256:f1cd098434e83e656abf198f103a8207a8187c0fc110306691a2e94a78d0abb2 \
--hash=sha256:f2bfb563d0211ce16b63c7cb9395d2c682a23187f54c3d79bfec33e6705473c6 \
--hash=sha256:f8ffb705ffcf5ddd0e80b65ddf7bed7ee4f5a441ea7d3419e861a12eaf41af58
# via jinja2
mccabe==0.7.0 \
--hash=sha256:348e0240c33b60bbdf4e523192ef919f28cb2c3d7d5c7794f74009290f236325 \
--hash=sha256:6c2d30ab6be0e4a46919781807b4f0d834ebdd6c6e3dca0bda5a15f863427b6e
# via
# flake8
# pylint
mypy==1.0.1 \
--hash=sha256:0af4f0e20706aadf4e6f8f8dc5ab739089146b83fd53cb4a7e0e850ef3de0bb6 \
--hash=sha256:15b5a824b58c7c822c51bc66308e759243c32631896743f030daf449fe3677f3 \
--hash=sha256:17455cda53eeee0a4adb6371a21dd3dbf465897de82843751cf822605d152c8c \
--hash=sha256:2013226d17f20468f34feddd6aae4635a55f79626549099354ce641bc7d40262 \
--hash=sha256:24189f23dc66f83b839bd1cce2dfc356020dfc9a8bae03978477b15be61b062e \
--hash=sha256:27a0f74a298769d9fdc8498fcb4f2beb86f0564bcdb1a37b58cbbe78e55cf8c0 \
--hash=sha256:28cea5a6392bb43d266782983b5a4216c25544cd7d80be681a155ddcdafd152d \
--hash=sha256:448de661536d270ce04f2d7dddaa49b2fdba6e3bd8a83212164d4174ff43aa65 \
--hash=sha256:48525aec92b47baed9b3380371ab8ab6e63a5aab317347dfe9e55e02aaad22e8 \
--hash=sha256:5bc8d6bd3b274dd3846597855d96d38d947aedba18776aa998a8d46fabdaed76 \
--hash=sha256:5deb252fd42a77add936b463033a59b8e48eb2eaec2976d76b6878d031933fe4 \
--hash=sha256:5f546ac34093c6ce33f6278f7c88f0f147a4849386d3bf3ae193702f4fe31407 \
--hash=sha256:5fdd63e4f50e3538617887e9aee91855368d9fc1dea30da743837b0df7373bc4 \
--hash=sha256:65b122a993d9c81ea0bfde7689b3365318a88bde952e4dfa1b3a8b4ac05d168b \
--hash=sha256:71a808334d3f41ef011faa5a5cd8153606df5fc0b56de5b2e89566c8093a0c9a \
--hash=sha256:920169f0184215eef19294fa86ea49ffd4635dedfdea2b57e45cb4ee85d5ccaf \
--hash=sha256:93a85495fb13dc484251b4c1fd7a5ac370cd0d812bbfc3b39c1bafefe95275d5 \
--hash=sha256:a2948c40a7dd46c1c33765718936669dc1f628f134013b02ff5ac6c7ef6942bf \
--hash=sha256:c6c2ccb7af7154673c591189c3687b013122c5a891bb5651eca3db8e6c6c55bd \
--hash=sha256:c96b8a0c019fe29040d520d9257d8c8f122a7343a8307bf8d6d4a43f5c5bfcc8 \
--hash=sha256:d42a98e76070a365a1d1c220fcac8aa4ada12ae0db679cb4d910fabefc88b994 \
--hash=sha256:dbeb24514c4acbc78d205f85dd0e800f34062efcc1f4a4857c57e4b4b8712bff \
--hash=sha256:e60d0b09f62ae97a94605c3f73fd952395286cf3e3b9e7b97f60b01ddfbbda88 \
--hash=sha256:e64f48c6176e243ad015e995de05af7f22bbe370dbb5b32bd6988438ec873919 \
--hash=sha256:e831662208055b006eef68392a768ff83596035ffd6d846786578ba1714ba8f6 \
--hash=sha256:eda5c8b9949ed411ff752b9a01adda31afe7eae1e53e946dbdf9db23865e66c4
# via rosbags (setup.cfg)
mypy-extensions==1.0.0 \
--hash=sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d \
--hash=sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782
# via mypy
numpy==1.24.2 \
--hash=sha256:003a9f530e880cb2cd177cba1af7220b9aa42def9c4afc2a2fc3ee6be7eb2b22 \
--hash=sha256:150947adbdfeceec4e5926d956a06865c1c690f2fd902efede4ca6fe2e657c3f \
--hash=sha256:2620e8592136e073bd12ee4536149380695fbe9ebeae845b81237f986479ffc9 \
--hash=sha256:2eabd64ddb96a1239791da78fa5f4e1693ae2dadc82a76bc76a14cbb2b966e96 \
--hash=sha256:4173bde9fa2a005c2c6e2ea8ac1618e2ed2c1c6ec8a7657237854d42094123a0 \
--hash=sha256:4199e7cfc307a778f72d293372736223e39ec9ac096ff0a2e64853b866a8e18a \
--hash=sha256:4cecaed30dc14123020f77b03601559fff3e6cd0c048f8b5289f4eeabb0eb281 \
--hash=sha256:557d42778a6869c2162deb40ad82612645e21d79e11c1dc62c6e82a2220ffb04 \
--hash=sha256:63e45511ee4d9d976637d11e6c9864eae50e12dc9598f531c035265991910468 \
--hash=sha256:6524630f71631be2dabe0c541e7675db82651eb998496bbe16bc4f77f0772253 \
--hash=sha256:76807b4063f0002c8532cfeac47a3068a69561e9c8715efdad3c642eb27c0756 \
--hash=sha256:7de8fdde0003f4294655aa5d5f0a89c26b9f22c0a58790c38fae1ed392d44a5a \
--hash=sha256:889b2cc88b837d86eda1b17008ebeb679d82875022200c6e8e4ce6cf549b7acb \
--hash=sha256:92011118955724465fb6853def593cf397b4a1367495e0b59a7e69d40c4eb71d \
--hash=sha256:97cf27e51fa078078c649a51d7ade3c92d9e709ba2bfb97493007103c741f1d0 \
--hash=sha256:9a23f8440561a633204a67fb44617ce2a299beecf3295f0d13c495518908e910 \
--hash=sha256:a51725a815a6188c662fb66fb32077709a9ca38053f0274640293a14fdd22978 \
--hash=sha256:a77d3e1163a7770164404607b7ba3967fb49b24782a6ef85d9b5f54126cc39e5 \
--hash=sha256:adbdce121896fd3a17a77ab0b0b5eedf05a9834a18699db6829a64e1dfccca7f \
--hash=sha256:c29e6bd0ec49a44d7690ecb623a8eac5ab8a923bce0bea6293953992edf3a76a \
--hash=sha256:c72a6b2f4af1adfe193f7beb91ddf708ff867a3f977ef2ec53c0ffb8283ab9f5 \
--hash=sha256:d0a2db9d20117bf523dde15858398e7c0858aadca7c0f088ac0d6edd360e9ad2 \
--hash=sha256:e3ab5d32784e843fc0dd3ab6dcafc67ef806e6b6828dc6af2f689be0eb4d781d \
--hash=sha256:e428c4fbfa085f947b536706a2fc349245d7baa8334f0c5723c56a10595f9b95 \
--hash=sha256:e8d2859428712785e8a8b7d2b3ef0a1d1565892367b32f915c4a4df44d0e64f5 \
--hash=sha256:eef70b4fc1e872ebddc38cddacc87c19a3709c0e3e5d20bf3954c147b1dd941d \
--hash=sha256:f64bb98ac59b3ea3bf74b02f13836eb2e24e48e0ab0145bbda646295769bd780 \
--hash=sha256:f9006288bcf4895917d02583cf3411f98631275bc67cce355a7f39f8c14338fa
# via rosbags (setup.cfg)
packaging==23.0 \
--hash=sha256:714ac14496c3e68c99c29b00845f7a2b85f3bb6f1078fd9f72fd20f0570002b2 \
--hash=sha256:b6ad297f8907de0fa2fe1ccbd26fdaf387f5f47c7275fedf8cce89f99446cf97
# via
# pytest
# sphinx
pep8-naming==0.13.3 \
--hash=sha256:1705f046dfcd851378aac3be1cd1551c7c1e5ff363bacad707d43007877fa971 \
--hash=sha256:1a86b8c71a03337c97181917e2b472f0f5e4ccb06844a0d6f0a33522549e7a80
# via rosbags (setup.cfg)
platformdirs==3.0.0 \
--hash=sha256:8a1228abb1ef82d788f74139988b137e78692984ec7b08eaa6c65f1723af28f9 \
--hash=sha256:b1d5eb14f221506f50d6604a561f4c5786d9e80355219694a1b244bcd96f4567
# via pylint
pluggy==1.0.0 \
--hash=sha256:4224373bacce55f955a878bf9cfa763c1e360858e330072059e10bad68531159 \
--hash=sha256:74134bbf457f031a36d68416e1509f34bd5ccc019f0bcc952c7b909d06b37bd3
# via pytest
pycodestyle==2.10.0 \
--hash=sha256:347187bdb476329d98f695c213d7295a846d1152ff4fe9bacb8a9590b8ee7053 \
--hash=sha256:8a4eaf0d0495c7395bdab3589ac2db602797d76207242c17d470186815706610
# via
# flake8
# flake8-print
pydocstyle==6.3.0 \
--hash=sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019 \
--hash=sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1
# via flake8-docstrings
pyflakes==3.0.1 \
--hash=sha256:ec55bf7fe21fff7f1ad2f7da62363d749e2a470500eab1b555334b67aa1ef8cf \
--hash=sha256:ec8b276a6b60bd80defed25add7e439881c19e64850afd9b346283d4165fd0fd
# via flake8
pygments==2.14.0 \
--hash=sha256:b3ed06a9e8ac9a9aae5a6f5dbe78a8a58655d17b43b93c078f094ddc476ae297 \
--hash=sha256:fa7bd7bd2771287c0de303af8bfdfc731f51bd2c6a47ab69d117138893b82717
# via sphinx
pylint==2.16.2 \
--hash=sha256:13b2c805a404a9bf57d002cd5f054ca4d40b0b87542bdaba5e05321ae8262c84 \
--hash=sha256:ff22dde9c2128cd257c145cfd51adeff0be7df4d80d669055f24a962b351bbe4
# via rosbags (setup.cfg)
pytest==7.2.1 \
--hash=sha256:c7c6ca206e93355074ae32f7403e8ea12163b1163c976fee7d4d84027c162be5 \
--hash=sha256:d45e0952f3727241918b8fd0f376f5ff6b301cc0777c6f9a556935c92d8a7d42
# via
# pytest-cov
# rosbags (setup.cfg)
pytest-cov==4.0.0 \
--hash=sha256:2feb1b751d66a8bd934e5edfa2e961d11309dc37b73b0eabe73b5945fee20f6b \
--hash=sha256:996b79efde6433cdbd0088872dbc5fb3ed7fe1578b68cdbba634f14bb8dd0470
# via rosbags (setup.cfg)
pytz==2022.7.1 \
--hash=sha256:01a0681c4b9684a28304615eba55d1ab31ae00bf68ec157ec3708a8182dbbcd0 \
--hash=sha256:78f4f37d8198e0627c5f1143240bb0206b8691d8d7ac6d78fee88b78733f8c4a
# via babel
requests==2.28.2 \
--hash=sha256:64299f4909223da747622c030b781c0d7811e359c37124b4bd368fb8c6518baa \
--hash=sha256:98b1b2782e3c6c4904938b84c0eb932721069dfdb9134313beff7c83c2df24bf
# via sphinx
ruamel-yaml==0.17.21 \
--hash=sha256:742b35d3d665023981bd6d16b3d24248ce5df75fdb4e2924e93a05c1f8b61ca7 \
--hash=sha256:8b7ce697a2f212752a35c1ac414471dc16c424c9573be4926b56ff3f5d23b7af
# via rosbags (setup.cfg)
ruamel-yaml-clib==0.2.7 \
--hash=sha256:045e0626baf1c52e5527bd5db361bc83180faaba2ff586e763d3d5982a876a9e \
--hash=sha256:15910ef4f3e537eea7fe45f8a5d19997479940d9196f357152a09031c5be59f3 \
--hash=sha256:184faeaec61dbaa3cace407cffc5819f7b977e75360e8d5ca19461cd851a5fc5 \
--hash=sha256:1f08fd5a2bea9c4180db71678e850b995d2a5f4537be0e94557668cf0f5f9497 \
--hash=sha256:2aa261c29a5545adfef9296b7e33941f46aa5bbd21164228e833412af4c9c75f \
--hash=sha256:3110a99e0f94a4a3470ff67fc20d3f96c25b13d24c6980ff841e82bafe827cac \
--hash=sha256:3243f48ecd450eddadc2d11b5feb08aca941b5cd98c9b1db14b2fd128be8c697 \
--hash=sha256:370445fd795706fd291ab00c9df38a0caed0f17a6fb46b0f607668ecb16ce763 \
--hash=sha256:40d030e2329ce5286d6b231b8726959ebbe0404c92f0a578c0e2482182e38282 \
--hash=sha256:41d0f1fa4c6830176eef5b276af04c89320ea616655d01327d5ce65e50575c94 \
--hash=sha256:4a4d8d417868d68b979076a9be6a38c676eca060785abaa6709c7b31593c35d1 \
--hash=sha256:4b3a93bb9bc662fc1f99c5c3ea8e623d8b23ad22f861eb6fce9377ac07ad6072 \
--hash=sha256:5bc0667c1eb8f83a3752b71b9c4ba55ef7c7058ae57022dd9b29065186a113d9 \
--hash=sha256:721bc4ba4525f53f6a611ec0967bdcee61b31df5a56801281027a3a6d1c2daf5 \
--hash=sha256:763d65baa3b952479c4e972669f679fe490eee058d5aa85da483ebae2009d231 \
--hash=sha256:7bdb4c06b063f6fd55e472e201317a3bb6cdeeee5d5a38512ea5c01e1acbdd93 \
--hash=sha256:8831a2cedcd0f0927f788c5bdf6567d9dc9cc235646a434986a852af1cb54b4b \
--hash=sha256:91a789b4aa0097b78c93e3dc4b40040ba55bef518f84a40d4442f713b4094acb \
--hash=sha256:92460ce908546ab69770b2e576e4f99fbb4ce6ab4b245345a3869a0a0410488f \
--hash=sha256:99e77daab5d13a48a4054803d052ff40780278240a902b880dd37a51ba01a307 \
--hash=sha256:a234a20ae07e8469da311e182e70ef6b199d0fbeb6c6cc2901204dd87fb867e8 \
--hash=sha256:a7b301ff08055d73223058b5c46c55638917f04d21577c95e00e0c4d79201a6b \
--hash=sha256:be2a7ad8fd8f7442b24323d24ba0b56c51219513cfa45b9ada3b87b76c374d4b \
--hash=sha256:bf9a6bc4a0221538b1a7de3ed7bca4c93c02346853f44e1cd764be0023cd3640 \
--hash=sha256:c3ca1fbba4ae962521e5eb66d72998b51f0f4d0f608d3c0347a48e1af262efa7 \
--hash=sha256:d000f258cf42fec2b1bbf2863c61d7b8918d31ffee905da62dede869254d3b8a \
--hash=sha256:d5859983f26d8cd7bb5c287ef452e8aacc86501487634573d260968f753e1d71 \
--hash=sha256:d5e51e2901ec2366b79f16c2299a03e74ba4531ddcfacc1416639c557aef0ad8 \
--hash=sha256:da538167284de58a52109a9b89b8f6a53ff8437dd6dc26d33b57bf6699153122 \
--hash=sha256:debc87a9516b237d0466a711b18b6ebeb17ba9f391eb7f91c649c5c4ec5006c7 \
--hash=sha256:df5828871e6648db72d1c19b4bd24819b80a755c4541d3409f0f7acd0f335c80 \
--hash=sha256:ecdf1a604009bd35c674b9225a8fa609e0282d9b896c03dd441a91e5f53b534e \
--hash=sha256:efa08d63ef03d079dcae1dfe334f6c8847ba8b645d08df286358b1f5293d24ab \
--hash=sha256:f01da5790e95815eb5a8a138508c01c758e5f5bc0ce4286c4f7028b8dd7ac3d0 \
--hash=sha256:f34019dced51047d6f70cb9383b2ae2853b7fc4dce65129a5acd49f4f9256646 \
--hash=sha256:f6d3d39611ac2e4f62c3128a9eed45f19a6608670c5a2f4f07f24e8de3441d38
# via ruamel-yaml
snowballstemmer==2.2.0 \
--hash=sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1 \
--hash=sha256:c8e1716e83cc398ae16824e5572ae04e0d9fc2c6b985fb0f900f5f0c96ecba1a
# via
# pydocstyle
# sphinx
sphinx==6.1.3 \
--hash=sha256:0dac3b698538ffef41716cf97ba26c1c7788dba73ce6f150c1ff5b4720786dd2 \
--hash=sha256:807d1cb3d6be87eb78a381c3e70ebd8d346b9a25f3753e9947e866b2786865fc
# via
# rosbags (setup.cfg)
# sphinx-autodoc-typehints
# sphinx-rtd-theme
sphinx-autodoc-typehints==1.22 \
--hash=sha256:71fca2d5eee9b034204e4c686ab20b4d8f5eb9409396216bcae6c87c38e18ea6 \
--hash=sha256:ef4a8b9d52de66065aa7d3adfabf5a436feb8a2eff07c2ddc31625d8807f2b69
# via rosbags (setup.cfg)
sphinx-rtd-theme==1.2.0 \
--hash=sha256:a0d8bd1a2ed52e0b338cbe19c4b2eef3c5e7a048769753dac6a9f059c7b641b8 \
--hash=sha256:f823f7e71890abe0ac6aaa6013361ea2696fc8d3e1fa798f463e82bdb77eeff2
# via rosbags (setup.cfg)
sphinxcontrib-applehelp==1.0.4 \
--hash=sha256:29d341f67fb0f6f586b23ad80e072c8e6ad0b48417db2bde114a4c9746feb228 \
--hash=sha256:828f867945bbe39817c210a1abfd1bc4895c8b73fcaade56d45357a348a07d7e
# via sphinx
sphinxcontrib-devhelp==1.0.2 \
--hash=sha256:8165223f9a335cc1af7ffe1ed31d2871f325254c0423bc0c4c7cd1c1e4734a2e \
--hash=sha256:ff7f1afa7b9642e7060379360a67e9c41e8f3121f2ce9164266f61b9f4b338e4
# via sphinx
sphinxcontrib-htmlhelp==2.0.1 \
--hash=sha256:0cbdd302815330058422b98a113195c9249825d681e18f11e8b1f78a2f11efff \
--hash=sha256:c38cb46dccf316c79de6e5515e1770414b797162b23cd3d06e67020e1d2a6903
# via sphinx
sphinxcontrib-jquery==2.0.0 \
--hash=sha256:8fb65f6dba84bf7bcd1aea1f02ab3955ac34611d838bcc95d4983b805b234daa \
--hash=sha256:ed47fa425c338ffebe3c37e1cdb56e30eb806116b85f01055b158c7057fdb995
# via sphinx-rtd-theme
sphinxcontrib-jsmath==1.0.1 \
--hash=sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178 \
--hash=sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8
# via sphinx
sphinxcontrib-qthelp==1.0.3 \
--hash=sha256:4c33767ee058b70dba89a6fc5c1892c0d57a54be67ddd3e7875a18d14cba5a72 \
--hash=sha256:bd9fc24bcb748a8d51fd4ecaade681350aa63009a347a8c14e637895444dfab6
# via sphinx
sphinxcontrib-serializinghtml==1.1.5 \
--hash=sha256:352a9a00ae864471d3a7ead8d7d79f5fc0b57e8b3f95e9867eb9eb28999b92fd \
--hash=sha256:aa5f6de5dfdf809ef505c4895e51ef5c9eac17d0f287933eb49ec495280b6952
# via sphinx
toml==0.10.2 \
--hash=sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b \
--hash=sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f
# via rosbags (setup.cfg)
tomli==2.0.1 \
--hash=sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc \
--hash=sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f
# via
# coverage
# flake8-pyprojecttoml
# mypy
# pylint
# pytest
tomlkit==0.11.6 \
--hash=sha256:07de26b0d8cfc18f871aec595fda24d95b08fef89d147caa861939f37230bf4b \
--hash=sha256:71b952e5721688937fb02cf9d354dbcf0785066149d2855e44531ebdd2b65d73
# via pylint
typing-extensions==4.5.0 \
--hash=sha256:5cb5f4a79139d699607b3ef622a1dedafa84e115ab0024e0d9c044a9479ca7cb \
--hash=sha256:fb33085c39dd998ac16d1431ebc293a8b3eedd00fd4a32de0ff79002c19511b4
# via
# astroid
# mypy
# pylint
urllib3==1.26.14 \
--hash=sha256:076907bf8fd355cde77728471316625a4d2f7e713c125f51953bb5b3eecf4f72 \
--hash=sha256:75edcdc2f7d85b137124a6c3c9fc3933cdeaa12ecb9a6a959f22797a0feca7e1
# via requests
wrapt==1.15.0 \
--hash=sha256:02fce1852f755f44f95af51f69d22e45080102e9d00258053b79367d07af39c0 \
--hash=sha256:077ff0d1f9d9e4ce6476c1a924a3332452c1406e59d90a2cf24aeb29eeac9420 \
--hash=sha256:078e2a1a86544e644a68422f881c48b84fef6d18f8c7a957ffd3f2e0a74a0d4a \
--hash=sha256:0970ddb69bba00670e58955f8019bec4a42d1785db3faa043c33d81de2bf843c \
--hash=sha256:1286eb30261894e4c70d124d44b7fd07825340869945c79d05bda53a40caa079 \
--hash=sha256:21f6d9a0d5b3a207cdf7acf8e58d7d13d463e639f0c7e01d82cdb671e6cb7923 \
--hash=sha256:230ae493696a371f1dbffaad3dafbb742a4d27a0afd2b1aecebe52b740167e7f \
--hash=sha256:26458da5653aa5b3d8dc8b24192f574a58984c749401f98fff994d41d3f08da1 \
--hash=sha256:2cf56d0e237280baed46f0b5316661da892565ff58309d4d2ed7dba763d984b8 \
--hash=sha256:2e51de54d4fb8fb50d6ee8327f9828306a959ae394d3e01a1ba8b2f937747d86 \
--hash=sha256:2fbfbca668dd15b744418265a9607baa970c347eefd0db6a518aaf0cfbd153c0 \
--hash=sha256:38adf7198f8f154502883242f9fe7333ab05a5b02de7d83aa2d88ea621f13364 \
--hash=sha256:3a8564f283394634a7a7054b7983e47dbf39c07712d7b177b37e03f2467a024e \
--hash=sha256:3abbe948c3cbde2689370a262a8d04e32ec2dd4f27103669a45c6929bcdbfe7c \
--hash=sha256:3bbe623731d03b186b3d6b0d6f51865bf598587c38d6f7b0be2e27414f7f214e \
--hash=sha256:40737a081d7497efea35ab9304b829b857f21558acfc7b3272f908d33b0d9d4c \
--hash=sha256:41d07d029dd4157ae27beab04d22b8e261eddfc6ecd64ff7000b10dc8b3a5727 \
--hash=sha256:46ed616d5fb42f98630ed70c3529541408166c22cdfd4540b88d5f21006b0eff \
--hash=sha256:493d389a2b63c88ad56cdc35d0fa5752daac56ca755805b1b0c530f785767d5e \
--hash=sha256:4ff0d20f2e670800d3ed2b220d40984162089a6e2c9646fdb09b85e6f9a8fc29 \
--hash=sha256:54accd4b8bc202966bafafd16e69da9d5640ff92389d33d28555c5fd4f25ccb7 \
--hash=sha256:56374914b132c702aa9aa9959c550004b8847148f95e1b824772d453ac204a72 \
--hash=sha256:578383d740457fa790fdf85e6d346fda1416a40549fe8db08e5e9bd281c6a475 \
--hash=sha256:58d7a75d731e8c63614222bcb21dd992b4ab01a399f1f09dd82af17bbfc2368a \
--hash=sha256:5c5aa28df055697d7c37d2099a7bc09f559d5053c3349b1ad0c39000e611d317 \
--hash=sha256:5fc8e02f5984a55d2c653f5fea93531e9836abbd84342c1d1e17abc4a15084c2 \
--hash=sha256:63424c681923b9f3bfbc5e3205aafe790904053d42ddcc08542181a30a7a51bd \
--hash=sha256:64b1df0f83706b4ef4cfb4fb0e4c2669100fd7ecacfb59e091fad300d4e04640 \
--hash=sha256:74934ebd71950e3db69960a7da29204f89624dde411afbfb3b4858c1409b1e98 \
--hash=sha256:75669d77bb2c071333417617a235324a1618dba66f82a750362eccbe5b61d248 \
--hash=sha256:75760a47c06b5974aa5e01949bf7e66d2af4d08cb8c1d6516af5e39595397f5e \
--hash=sha256:76407ab327158c510f44ded207e2f76b657303e17cb7a572ffe2f5a8a48aa04d \
--hash=sha256:76e9c727a874b4856d11a32fb0b389afc61ce8aaf281ada613713ddeadd1cfec \
--hash=sha256:77d4c1b881076c3ba173484dfa53d3582c1c8ff1f914c6461ab70c8428b796c1 \
--hash=sha256:780c82a41dc493b62fc5884fb1d3a3b81106642c5c5c78d6a0d4cbe96d62ba7e \
--hash=sha256:7dc0713bf81287a00516ef43137273b23ee414fe41a3c14be10dd95ed98a2df9 \
--hash=sha256:7eebcdbe3677e58dd4c0e03b4f2cfa346ed4049687d839adad68cc38bb559c92 \
--hash=sha256:896689fddba4f23ef7c718279e42f8834041a21342d95e56922e1c10c0cc7afb \
--hash=sha256:96177eb5645b1c6985f5c11d03fc2dbda9ad24ec0f3a46dcce91445747e15094 \
--hash=sha256:96e25c8603a155559231c19c0349245eeb4ac0096fe3c1d0be5c47e075bd4f46 \
--hash=sha256:9d37ac69edc5614b90516807de32d08cb8e7b12260a285ee330955604ed9dd29 \
--hash=sha256:9ed6aa0726b9b60911f4aed8ec5b8dd7bf3491476015819f56473ffaef8959bd \
--hash=sha256:a487f72a25904e2b4bbc0817ce7a8de94363bd7e79890510174da9d901c38705 \
--hash=sha256:a4cbb9ff5795cd66f0066bdf5947f170f5d63a9274f99bdbca02fd973adcf2a8 \
--hash=sha256:a74d56552ddbde46c246b5b89199cb3fd182f9c346c784e1a93e4dc3f5ec9975 \
--hash=sha256:a89ce3fd220ff144bd9d54da333ec0de0399b52c9ac3d2ce34b569cf1a5748fb \
--hash=sha256:abd52a09d03adf9c763d706df707c343293d5d106aea53483e0ec8d9e310ad5e \
--hash=sha256:abd8f36c99512755b8456047b7be10372fca271bf1467a1caa88db991e7c421b \
--hash=sha256:af5bd9ccb188f6a5fdda9f1f09d9f4c86cc8a539bd48a0bfdc97723970348418 \
--hash=sha256:b02f21c1e2074943312d03d243ac4388319f2456576b2c6023041c4d57cd7019 \
--hash=sha256:b06fa97478a5f478fb05e1980980a7cdf2712015493b44d0c87606c1513ed5b1 \
--hash=sha256:b0724f05c396b0a4c36a3226c31648385deb6a65d8992644c12a4963c70326ba \
--hash=sha256:b130fe77361d6771ecf5a219d8e0817d61b236b7d8b37cc045172e574ed219e6 \
--hash=sha256:b56d5519e470d3f2fe4aa7585f0632b060d532d0696c5bdfb5e8319e1d0f69a2 \
--hash=sha256:b67b819628e3b748fd3c2192c15fb951f549d0f47c0449af0764d7647302fda3 \
--hash=sha256:ba1711cda2d30634a7e452fc79eabcadaffedf241ff206db2ee93dd2c89a60e7 \
--hash=sha256:bbeccb1aa40ab88cd29e6c7d8585582c99548f55f9b2581dfc5ba68c59a85752 \
--hash=sha256:bd84395aab8e4d36263cd1b9308cd504f6cf713b7d6d3ce25ea55670baec5416 \
--hash=sha256:c99f4309f5145b93eca6e35ac1a988f0dc0a7ccf9ccdcd78d3c0adf57224e62f \
--hash=sha256:ca1cccf838cd28d5a0883b342474c630ac48cac5df0ee6eacc9c7290f76b11c1 \
--hash=sha256:cd525e0e52a5ff16653a3fc9e3dd827981917d34996600bbc34c05d048ca35cc \
--hash=sha256:cdb4f085756c96a3af04e6eca7f08b1345e94b53af8921b25c72f096e704e145 \
--hash=sha256:ce42618f67741d4697684e501ef02f29e758a123aa2d669e2d964ff734ee00ee \
--hash=sha256:d06730c6aed78cee4126234cf2d071e01b44b915e725a6cb439a879ec9754a3a \
--hash=sha256:d5fe3e099cf07d0fb5a1e23d399e5d4d1ca3e6dfcbe5c8570ccff3e9208274f7 \
--hash=sha256:d6bcbfc99f55655c3d93feb7ef3800bd5bbe963a755687cbf1f490a71fb7794b \
--hash=sha256:d787272ed958a05b2c86311d3a4135d3c2aeea4fc655705f074130aa57d71653 \
--hash=sha256:e169e957c33576f47e21864cf3fc9ff47c223a4ebca8960079b8bd36cb014fd0 \
--hash=sha256:e20076a211cd6f9b44a6be58f7eeafa7ab5720eb796975d0c03f05b47d89eb90 \
--hash=sha256:e826aadda3cae59295b95343db8f3d965fb31059da7de01ee8d1c40a60398b29 \
--hash=sha256:eef4d64c650f33347c1f9266fa5ae001440b232ad9b98f1f43dfe7a79435c0a6 \
--hash=sha256:f2e69b3ed24544b0d3dbe2c5c0ba5153ce50dcebb576fdc4696d52aa22db6034 \
--hash=sha256:f87ec75864c37c4c6cb908d282e1969e79763e0d9becdfe9fe5473b7bb1e5f09 \
--hash=sha256:fbec11614dba0424ca72f4e8ba3c420dba07b4a7c206c8c8e4e73f2e98f4c559 \
--hash=sha256:fd69666217b62fa5d7c6aa88e507493a34dec4fa20c5bd925e4bc12fce586639
# via astroid
yapf==0.32.0 \
--hash=sha256:8fea849025584e486fd06d6ba2bed717f396080fd3cc236ba10cb97c4c51cf32 \
--hash=sha256:a3f5085d37ef7e3e004c4ba9f9b3e40c54ff1901cd111f05145ae313a7c67d1b
# via rosbags (setup.cfg)
zipp==3.15.0 \
--hash=sha256:112929ad649da941c23de50f356a2b5570c954b65150642bccdd66bf194d224b \
--hash=sha256:48904fc76a60e542af151aded95726c1a5c34ed43ab4134b597665c86d7ad556
# via importlib-metadata
zstandard==0.20.0 \
--hash=sha256:0488f2a238b4560828b3a595f3337daac4d3725c2a1637ffe2a0d187c091da59 \
--hash=sha256:059316f07e39b7214cd9eed565d26ab239035d2c76835deeff381995f7a27ba8 \
--hash=sha256:0aa4d178560d7ee32092ddfd415c2cdc6ab5ddce9554985c75f1a019a0ff4c55 \
--hash=sha256:0b815dec62e2d5a1bf7a373388f2616f21a27047b9b999de328bca7462033708 \
--hash=sha256:0d213353d58ad37fb5070314b156fb983b4d680ed5f3fce76ab013484cf3cf12 \
--hash=sha256:0f32a8f3a697ef87e67c0d0c0673b245babee6682b2c95e46eb30208ffb720bd \
--hash=sha256:29699746fae2760d3963a4ffb603968e77da55150ee0a3326c0569f4e35f319f \
--hash=sha256:2adf65cfce73ce94ef4c482f6cc01f08ddf5e1ca0c1ec95f2b63840f9e4c226c \
--hash=sha256:2eeb9e1ecd48ac1d352608bfe0dc1ed78a397698035a1796cf72f0c9d905d219 \
--hash=sha256:302a31400de0280f17c4ce67a73444a7a069f228db64048e4ce555cd0c02fbc4 \
--hash=sha256:39ae788dcdc404c07ef7aac9b11925185ea0831b985db0bbc43f95acdbd1c2ce \
--hash=sha256:39cbaf8fe3fa3515d35fb790465db4dc1ff45e58e1e00cbaf8b714e85437f039 \
--hash=sha256:40466adfa071f58bfa448d90f9623d6aff67c6d86de6fc60be47a26388f6c74d \
--hash=sha256:489959e2d52f7f1fe8ea275fecde6911d454df465265bf3ec51b3e755e769a5e \
--hash=sha256:4a3c36284c219a4d2694e52b2582fe5d5f0ecaf94a22cf0ea959b527dbd8a2a6 \
--hash=sha256:4abf9a9e0841b844736d1ae8ead2b583d2cd212815eab15391b702bde17477a7 \
--hash=sha256:4af5d1891eebef430038ea4981957d31b1eb70aca14b906660c3ac1c3e7a8612 \
--hash=sha256:5499d65d4a1978dccf0a9c2c0d12415e16d4995ffad7a0bc4f72cc66691cf9f2 \
--hash=sha256:5a3578b182c21b8af3c49619eb4cd0b9127fa60791e621b34217d65209722002 \
--hash=sha256:613daadd72c71b1488742cafb2c3b381c39d0c9bb8c6cc157aa2d5ea45cc2efc \
--hash=sha256:6179808ebd1ebc42b1e2f221a23c28a22d3bc8f79209ae4a3cc114693c380bff \
--hash=sha256:7041efe3a93d0975d2ad16451720932e8a3d164be8521bfd0873b27ac917b77a \
--hash=sha256:78fb35d07423f25efd0fc90d0d4710ae83cfc86443a32192b0c6cb8475ec79a5 \
--hash=sha256:79c3058ccbe1fa37356a73c9d3c0475ec935ab528f5b76d56fc002a5a23407c7 \
--hash=sha256:84c1dae0c0a21eea245b5691286fe6470dc797d5e86e0c26b57a3afd1e750b48 \
--hash=sha256:862ad0a5c94670f2bd6f64fff671bd2045af5f4ed428a3f2f69fa5e52483f86a \
--hash=sha256:9aca916724d0802d3e70dc68adeff893efece01dffe7252ee3ae0053f1f1990f \
--hash=sha256:9aea3c7bab4276212e5ac63d28e6bd72a79ff058d57e06926dfe30a52451d943 \
--hash=sha256:a56036c08645aa6041d435a50103428f0682effdc67f5038de47cea5e4221d6f \
--hash=sha256:a5efe366bf0545a1a5a917787659b445ba16442ae4093f102204f42a9da1ecbc \
--hash=sha256:afbcd2ed0c1145e24dd3df8440a429688a1614b83424bc871371b176bed429f9 \
--hash=sha256:b07f391fd85e3d07514c05fb40c5573b398d0063ab2bada6eb09949ec6004772 \
--hash=sha256:b0f556c74c6f0f481b61d917e48c341cdfbb80cc3391511345aed4ce6fb52fdc \
--hash=sha256:b671b75ae88139b1dd022fa4aa66ba419abd66f98869af55a342cb9257a1831e \
--hash=sha256:b6d718f1b7cd30adb02c2a46dde0f25a84a9de8865126e0fff7d0162332d6b92 \
--hash=sha256:ba4bb4c5a0cac802ff485fa1e57f7763df5efa0ad4ee10c2693ecc5a018d2c1a \
--hash=sha256:ba86f931bf925e9561ccd6cb978acb163e38c425990927feb38be10c894fa937 \
--hash=sha256:c1929afea64da48ec59eca9055d7ec7e5955801489ac40ac2a19dde19e7edad9 \
--hash=sha256:c28c7441638c472bfb794f424bd560a22c7afce764cd99196e8d70fbc4d14e85 \
--hash=sha256:c4efa051799703dc37c072e22af1f0e4c77069a78fb37caf70e26414c738ca1d \
--hash=sha256:cc98c8bcaa07150d3f5d7c4bd264eaa4fdd4a4dfb8fd3f9d62565ae5c4aba227 \
--hash=sha256:cd0aa9a043c38901925ae1bba49e1e638f2d9c3cdf1b8000868993c642deb7f2 \
--hash=sha256:cdd769da7add8498658d881ce0eeb4c35ea1baac62e24c5a030c50f859f29724 \
--hash=sha256:d08459f7f7748398a6cc65eb7f88aa7ef5731097be2ddfba544be4b558acd900 \
--hash=sha256:dc47cec184e66953f635254e5381df8a22012a2308168c069230b1a95079ccd0 \
--hash=sha256:e3f6887d2bdfb5752d5544860bd6b778e53ebfaf4ab6c3f9d7fd388445429d41 \
--hash=sha256:e6b4de1ba2f3028fafa0d82222d1e91b729334c8d65fbf04290c65c09d7457e1 \
--hash=sha256:ee2a1510e06dfc7706ea9afad363efe222818a1eafa59abc32d9bbcd8465fba7 \
--hash=sha256:f199d58f3fd7dfa0d447bc255ff22571f2e4e5e5748bfd1c41370454723cb053 \
--hash=sha256:f1ba6bbd28ad926d130f0af8016f3a2930baa013c2128cfff46ca76432f50669 \
--hash=sha256:f847701d77371d90783c0ce6cfdb7ebde4053882c2aaba7255c70ae3c3eb7af0
# via rosbags (setup.cfg)
# WARNING: The following packages were not pinned, but pip requires them to be
# pinned when the requirements file includes hashes. Consider using the --allow-unsafe flag.
# setuptools
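The warning above notes that setuptools was left unpinned and suggests `--allow-unsafe`. A minimal sketch of regenerating this hashed requirements file with that flag (assumes pip-tools is installed and the command is run from the directory containing `setup.cfg`):

```shell
# Regenerate the hashed pins, additionally pinning "unsafe" packages
# such as setuptools so pip's hash-checking mode is satisfied:
python -m piptools compile --generate-hashes --allow-unsafe setup.cfg
```

With `--allow-unsafe`, pip-compile emits a pinned, hashed entry for setuptools instead of the warning comment.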

168
rosbags/requirements.txt Normal file

@@ -0,0 +1,168 @@
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
# pip-compile --generate-hashes setup.cfg
#
lz4==4.3.2 \
--hash=sha256:0ca83a623c449295bafad745dcd399cea4c55b16b13ed8cfea30963b004016c9 \
--hash=sha256:0f5614d8229b33d4a97cb527db2a1ac81308c6e796e7bdb5d1309127289f69d5 \
--hash=sha256:1c4c100d99eed7c08d4e8852dd11e7d1ec47a3340f49e3a96f8dfbba17ffb300 \
--hash=sha256:1f25eb322eeb24068bb7647cae2b0732b71e5c639e4e4026db57618dcd8279f0 \
--hash=sha256:200d05777d61ba1ff8d29cb51c534a162ea0b4fe6d3c28be3571a0a48ff36080 \
--hash=sha256:31d72731c4ac6ebdce57cd9a5cabe0aecba229c4f31ba3e2c64ae52eee3fdb1c \
--hash=sha256:3a85b430138882f82f354135b98c320dafb96fc8fe4656573d95ab05de9eb092 \
--hash=sha256:4931ab28a0d1c133104613e74eec1b8bb1f52403faabe4f47f93008785c0b929 \
--hash=sha256:4caedeb19e3ede6c7a178968b800f910db6503cb4cb1e9cc9221157572139b49 \
--hash=sha256:65d5c93f8badacfa0456b660285e394e65023ef8071142e0dcbd4762166e1be0 \
--hash=sha256:6b50f096a6a25f3b2edca05aa626ce39979d63c3b160687c8c6d50ac3943d0ba \
--hash=sha256:7211dc8f636ca625abc3d4fb9ab74e5444b92df4f8d58ec83c8868a2b0ff643d \
--hash=sha256:7a9eec24ec7d8c99aab54de91b4a5a149559ed5b3097cf30249b665689b3d402 \
--hash=sha256:7c2df117def1589fba1327dceee51c5c2176a2b5a7040b45e84185ce0c08b6a3 \
--hash=sha256:7e2dc1bd88b60fa09b9b37f08553f45dc2b770c52a5996ea52b2b40f25445676 \
--hash=sha256:83903fe6db92db0be101acedc677aa41a490b561567fe1b3fe68695b2110326c \
--hash=sha256:83acfacab3a1a7ab9694333bcb7950fbeb0be21660d236fd09c8337a50817897 \
--hash=sha256:86480f14a188c37cb1416cdabacfb4e42f7a5eab20a737dac9c4b1c227f3b822 \
--hash=sha256:867664d9ca9bdfce840ac96d46cd8838c9ae891e859eb98ce82fcdf0e103a947 \
--hash=sha256:8df16c9a2377bdc01e01e6de5a6e4bbc66ddf007a6b045688e285d7d9d61d1c9 \
--hash=sha256:8f00a9ba98f6364cadda366ae6469b7b3568c0cced27e16a47ddf6b774169270 \
--hash=sha256:926b26db87ec8822cf1870efc3d04d06062730ec3279bbbd33ba47a6c0a5c673 \
--hash=sha256:a6a46889325fd60b8a6b62ffc61588ec500a1883db32cddee9903edfba0b7584 \
--hash=sha256:a98b61e504fb69f99117b188e60b71e3c94469295571492a6468c1acd63c37ba \
--hash=sha256:ad38dc6a7eea6f6b8b642aaa0683253288b0460b70cab3216838747163fb774d \
--hash=sha256:b10b77dc2e6b1daa2f11e241141ab8285c42b4ed13a8642495620416279cc5b2 \
--hash=sha256:d5ea0e788dc7e2311989b78cae7accf75a580827b4d96bbaf06c7e5a03989bd5 \
--hash=sha256:e05afefc4529e97c08e65ef92432e5f5225c0bb21ad89dee1e06a882f91d7f5e \
--hash=sha256:e1431d84a9cfb23e6773e72078ce8e65cad6745816d4cbf9ae67da5ea419acda \
--hash=sha256:ec6755cacf83f0c5588d28abb40a1ac1643f2ff2115481089264c7630236618a \
--hash=sha256:edc2fb3463d5d9338ccf13eb512aab61937be50aa70734bcf873f2f493801d3b \
--hash=sha256:edd8987d8415b5dad25e797043936d91535017237f72fa456601be1479386c92 \
--hash=sha256:edda4fb109439b7f3f58ed6bede59694bc631c4b69c041112b1b7dc727fffb23 \
--hash=sha256:f571eab7fec554d3b1db0d666bdc2ad85c81f4b8cb08906c4c59a8cad75e6e22 \
--hash=sha256:f7c50542b4ddceb74ab4f8b3435327a0861f06257ca501d59067a6a482535a77
# via rosbags (setup.cfg)
numpy==1.24.2 \
--hash=sha256:003a9f530e880cb2cd177cba1af7220b9aa42def9c4afc2a2fc3ee6be7eb2b22 \
--hash=sha256:150947adbdfeceec4e5926d956a06865c1c690f2fd902efede4ca6fe2e657c3f \
--hash=sha256:2620e8592136e073bd12ee4536149380695fbe9ebeae845b81237f986479ffc9 \
--hash=sha256:2eabd64ddb96a1239791da78fa5f4e1693ae2dadc82a76bc76a14cbb2b966e96 \
--hash=sha256:4173bde9fa2a005c2c6e2ea8ac1618e2ed2c1c6ec8a7657237854d42094123a0 \
--hash=sha256:4199e7cfc307a778f72d293372736223e39ec9ac096ff0a2e64853b866a8e18a \
--hash=sha256:4cecaed30dc14123020f77b03601559fff3e6cd0c048f8b5289f4eeabb0eb281 \
--hash=sha256:557d42778a6869c2162deb40ad82612645e21d79e11c1dc62c6e82a2220ffb04 \
--hash=sha256:63e45511ee4d9d976637d11e6c9864eae50e12dc9598f531c035265991910468 \
--hash=sha256:6524630f71631be2dabe0c541e7675db82651eb998496bbe16bc4f77f0772253 \
--hash=sha256:76807b4063f0002c8532cfeac47a3068a69561e9c8715efdad3c642eb27c0756 \
--hash=sha256:7de8fdde0003f4294655aa5d5f0a89c26b9f22c0a58790c38fae1ed392d44a5a \
--hash=sha256:889b2cc88b837d86eda1b17008ebeb679d82875022200c6e8e4ce6cf549b7acb \
--hash=sha256:92011118955724465fb6853def593cf397b4a1367495e0b59a7e69d40c4eb71d \
--hash=sha256:97cf27e51fa078078c649a51d7ade3c92d9e709ba2bfb97493007103c741f1d0 \
--hash=sha256:9a23f8440561a633204a67fb44617ce2a299beecf3295f0d13c495518908e910 \
--hash=sha256:a51725a815a6188c662fb66fb32077709a9ca38053f0274640293a14fdd22978 \
--hash=sha256:a77d3e1163a7770164404607b7ba3967fb49b24782a6ef85d9b5f54126cc39e5 \
--hash=sha256:adbdce121896fd3a17a77ab0b0b5eedf05a9834a18699db6829a64e1dfccca7f \
--hash=sha256:c29e6bd0ec49a44d7690ecb623a8eac5ab8a923bce0bea6293953992edf3a76a \
--hash=sha256:c72a6b2f4af1adfe193f7beb91ddf708ff867a3f977ef2ec53c0ffb8283ab9f5 \
--hash=sha256:d0a2db9d20117bf523dde15858398e7c0858aadca7c0f088ac0d6edd360e9ad2 \
--hash=sha256:e3ab5d32784e843fc0dd3ab6dcafc67ef806e6b6828dc6af2f689be0eb4d781d \
--hash=sha256:e428c4fbfa085f947b536706a2fc349245d7baa8334f0c5723c56a10595f9b95 \
--hash=sha256:e8d2859428712785e8a8b7d2b3ef0a1d1565892367b32f915c4a4df44d0e64f5 \
--hash=sha256:eef70b4fc1e872ebddc38cddacc87c19a3709c0e3e5d20bf3954c147b1dd941d \
--hash=sha256:f64bb98ac59b3ea3bf74b02f13836eb2e24e48e0ab0145bbda646295769bd780 \
--hash=sha256:f9006288bcf4895917d02583cf3411f98631275bc67cce355a7f39f8c14338fa
# via rosbags (setup.cfg)
ruamel-yaml==0.17.21 \
--hash=sha256:742b35d3d665023981bd6d16b3d24248ce5df75fdb4e2924e93a05c1f8b61ca7 \
--hash=sha256:8b7ce697a2f212752a35c1ac414471dc16c424c9573be4926b56ff3f5d23b7af
# via rosbags (setup.cfg)
ruamel-yaml-clib==0.2.7 \
--hash=sha256:045e0626baf1c52e5527bd5db361bc83180faaba2ff586e763d3d5982a876a9e \
--hash=sha256:15910ef4f3e537eea7fe45f8a5d19997479940d9196f357152a09031c5be59f3 \
--hash=sha256:184faeaec61dbaa3cace407cffc5819f7b977e75360e8d5ca19461cd851a5fc5 \
--hash=sha256:1f08fd5a2bea9c4180db71678e850b995d2a5f4537be0e94557668cf0f5f9497 \
--hash=sha256:2aa261c29a5545adfef9296b7e33941f46aa5bbd21164228e833412af4c9c75f \
--hash=sha256:3110a99e0f94a4a3470ff67fc20d3f96c25b13d24c6980ff841e82bafe827cac \
--hash=sha256:3243f48ecd450eddadc2d11b5feb08aca941b5cd98c9b1db14b2fd128be8c697 \
--hash=sha256:370445fd795706fd291ab00c9df38a0caed0f17a6fb46b0f607668ecb16ce763 \
--hash=sha256:40d030e2329ce5286d6b231b8726959ebbe0404c92f0a578c0e2482182e38282 \
--hash=sha256:41d0f1fa4c6830176eef5b276af04c89320ea616655d01327d5ce65e50575c94 \
--hash=sha256:4a4d8d417868d68b979076a9be6a38c676eca060785abaa6709c7b31593c35d1 \
--hash=sha256:4b3a93bb9bc662fc1f99c5c3ea8e623d8b23ad22f861eb6fce9377ac07ad6072 \
--hash=sha256:5bc0667c1eb8f83a3752b71b9c4ba55ef7c7058ae57022dd9b29065186a113d9 \
--hash=sha256:721bc4ba4525f53f6a611ec0967bdcee61b31df5a56801281027a3a6d1c2daf5 \
--hash=sha256:763d65baa3b952479c4e972669f679fe490eee058d5aa85da483ebae2009d231 \
--hash=sha256:7bdb4c06b063f6fd55e472e201317a3bb6cdeeee5d5a38512ea5c01e1acbdd93 \
--hash=sha256:8831a2cedcd0f0927f788c5bdf6567d9dc9cc235646a434986a852af1cb54b4b \
--hash=sha256:91a789b4aa0097b78c93e3dc4b40040ba55bef518f84a40d4442f713b4094acb \
--hash=sha256:92460ce908546ab69770b2e576e4f99fbb4ce6ab4b245345a3869a0a0410488f \
--hash=sha256:99e77daab5d13a48a4054803d052ff40780278240a902b880dd37a51ba01a307 \
--hash=sha256:a234a20ae07e8469da311e182e70ef6b199d0fbeb6c6cc2901204dd87fb867e8 \
--hash=sha256:a7b301ff08055d73223058b5c46c55638917f04d21577c95e00e0c4d79201a6b \
--hash=sha256:be2a7ad8fd8f7442b24323d24ba0b56c51219513cfa45b9ada3b87b76c374d4b \
--hash=sha256:bf9a6bc4a0221538b1a7de3ed7bca4c93c02346853f44e1cd764be0023cd3640 \
--hash=sha256:c3ca1fbba4ae962521e5eb66d72998b51f0f4d0f608d3c0347a48e1af262efa7 \
--hash=sha256:d000f258cf42fec2b1bbf2863c61d7b8918d31ffee905da62dede869254d3b8a \
--hash=sha256:d5859983f26d8cd7bb5c287ef452e8aacc86501487634573d260968f753e1d71 \
--hash=sha256:d5e51e2901ec2366b79f16c2299a03e74ba4531ddcfacc1416639c557aef0ad8 \
--hash=sha256:da538167284de58a52109a9b89b8f6a53ff8437dd6dc26d33b57bf6699153122 \
--hash=sha256:debc87a9516b237d0466a711b18b6ebeb17ba9f391eb7f91c649c5c4ec5006c7 \
--hash=sha256:df5828871e6648db72d1c19b4bd24819b80a755c4541d3409f0f7acd0f335c80 \
--hash=sha256:ecdf1a604009bd35c674b9225a8fa609e0282d9b896c03dd441a91e5f53b534e \
--hash=sha256:efa08d63ef03d079dcae1dfe334f6c8847ba8b645d08df286358b1f5293d24ab \
--hash=sha256:f01da5790e95815eb5a8a138508c01c758e5f5bc0ce4286c4f7028b8dd7ac3d0 \
--hash=sha256:f34019dced51047d6f70cb9383b2ae2853b7fc4dce65129a5acd49f4f9256646 \
--hash=sha256:f6d3d39611ac2e4f62c3128a9eed45f19a6608670c5a2f4f07f24e8de3441d38
# via ruamel-yaml
zstandard==0.20.0 \
--hash=sha256:0488f2a238b4560828b3a595f3337daac4d3725c2a1637ffe2a0d187c091da59 \
--hash=sha256:059316f07e39b7214cd9eed565d26ab239035d2c76835deeff381995f7a27ba8 \
--hash=sha256:0aa4d178560d7ee32092ddfd415c2cdc6ab5ddce9554985c75f1a019a0ff4c55 \
--hash=sha256:0b815dec62e2d5a1bf7a373388f2616f21a27047b9b999de328bca7462033708 \
--hash=sha256:0d213353d58ad37fb5070314b156fb983b4d680ed5f3fce76ab013484cf3cf12 \
--hash=sha256:0f32a8f3a697ef87e67c0d0c0673b245babee6682b2c95e46eb30208ffb720bd \
--hash=sha256:29699746fae2760d3963a4ffb603968e77da55150ee0a3326c0569f4e35f319f \
--hash=sha256:2adf65cfce73ce94ef4c482f6cc01f08ddf5e1ca0c1ec95f2b63840f9e4c226c \
--hash=sha256:2eeb9e1ecd48ac1d352608bfe0dc1ed78a397698035a1796cf72f0c9d905d219 \
--hash=sha256:302a31400de0280f17c4ce67a73444a7a069f228db64048e4ce555cd0c02fbc4 \
--hash=sha256:39ae788dcdc404c07ef7aac9b11925185ea0831b985db0bbc43f95acdbd1c2ce \
--hash=sha256:39cbaf8fe3fa3515d35fb790465db4dc1ff45e58e1e00cbaf8b714e85437f039 \
--hash=sha256:40466adfa071f58bfa448d90f9623d6aff67c6d86de6fc60be47a26388f6c74d \
--hash=sha256:489959e2d52f7f1fe8ea275fecde6911d454df465265bf3ec51b3e755e769a5e \
--hash=sha256:4a3c36284c219a4d2694e52b2582fe5d5f0ecaf94a22cf0ea959b527dbd8a2a6 \
--hash=sha256:4abf9a9e0841b844736d1ae8ead2b583d2cd212815eab15391b702bde17477a7 \
--hash=sha256:4af5d1891eebef430038ea4981957d31b1eb70aca14b906660c3ac1c3e7a8612 \
--hash=sha256:5499d65d4a1978dccf0a9c2c0d12415e16d4995ffad7a0bc4f72cc66691cf9f2 \
--hash=sha256:5a3578b182c21b8af3c49619eb4cd0b9127fa60791e621b34217d65209722002 \
--hash=sha256:613daadd72c71b1488742cafb2c3b381c39d0c9bb8c6cc157aa2d5ea45cc2efc \
--hash=sha256:6179808ebd1ebc42b1e2f221a23c28a22d3bc8f79209ae4a3cc114693c380bff \
--hash=sha256:7041efe3a93d0975d2ad16451720932e8a3d164be8521bfd0873b27ac917b77a \
--hash=sha256:78fb35d07423f25efd0fc90d0d4710ae83cfc86443a32192b0c6cb8475ec79a5 \
--hash=sha256:79c3058ccbe1fa37356a73c9d3c0475ec935ab528f5b76d56fc002a5a23407c7 \
--hash=sha256:84c1dae0c0a21eea245b5691286fe6470dc797d5e86e0c26b57a3afd1e750b48 \
--hash=sha256:862ad0a5c94670f2bd6f64fff671bd2045af5f4ed428a3f2f69fa5e52483f86a \
--hash=sha256:9aca916724d0802d3e70dc68adeff893efece01dffe7252ee3ae0053f1f1990f \
--hash=sha256:9aea3c7bab4276212e5ac63d28e6bd72a79ff058d57e06926dfe30a52451d943 \
--hash=sha256:a56036c08645aa6041d435a50103428f0682effdc67f5038de47cea5e4221d6f \
--hash=sha256:a5efe366bf0545a1a5a917787659b445ba16442ae4093f102204f42a9da1ecbc \
--hash=sha256:afbcd2ed0c1145e24dd3df8440a429688a1614b83424bc871371b176bed429f9 \
--hash=sha256:b07f391fd85e3d07514c05fb40c5573b398d0063ab2bada6eb09949ec6004772 \
--hash=sha256:b0f556c74c6f0f481b61d917e48c341cdfbb80cc3391511345aed4ce6fb52fdc \
--hash=sha256:b671b75ae88139b1dd022fa4aa66ba419abd66f98869af55a342cb9257a1831e \
--hash=sha256:b6d718f1b7cd30adb02c2a46dde0f25a84a9de8865126e0fff7d0162332d6b92 \
--hash=sha256:ba4bb4c5a0cac802ff485fa1e57f7763df5efa0ad4ee10c2693ecc5a018d2c1a \
--hash=sha256:ba86f931bf925e9561ccd6cb978acb163e38c425990927feb38be10c894fa937 \
--hash=sha256:c1929afea64da48ec59eca9055d7ec7e5955801489ac40ac2a19dde19e7edad9 \
--hash=sha256:c28c7441638c472bfb794f424bd560a22c7afce764cd99196e8d70fbc4d14e85 \
--hash=sha256:c4efa051799703dc37c072e22af1f0e4c77069a78fb37caf70e26414c738ca1d \
--hash=sha256:cc98c8bcaa07150d3f5d7c4bd264eaa4fdd4a4dfb8fd3f9d62565ae5c4aba227 \
--hash=sha256:cd0aa9a043c38901925ae1bba49e1e638f2d9c3cdf1b8000868993c642deb7f2 \
--hash=sha256:cdd769da7add8498658d881ce0eeb4c35ea1baac62e24c5a030c50f859f29724 \
--hash=sha256:d08459f7f7748398a6cc65eb7f88aa7ef5731097be2ddfba544be4b558acd900 \
--hash=sha256:dc47cec184e66953f635254e5381df8a22012a2308168c069230b1a95079ccd0 \
--hash=sha256:e3f6887d2bdfb5752d5544860bd6b778e53ebfaf4ab6c3f9d7fd388445429d41 \
--hash=sha256:e6b4de1ba2f3028fafa0d82222d1e91b729334c8d65fbf04290c65c09d7457e1 \
--hash=sha256:ee2a1510e06dfc7706ea9afad363efe222818a1eafa59abc32d9bbcd8465fba7 \
--hash=sha256:f199d58f3fd7dfa0d447bc255ff22571f2e4e5e5748bfd1c41370454723cb053 \
--hash=sha256:f1ba6bbd28ad926d130f0af8016f3a2930baa013c2128cfff46ca76432f50669 \
--hash=sha256:f847701d77371d90783c0ce6cfdb7ebde4053882c2aaba7255c70ae3c3eb7af0
# via rosbags (setup.cfg)

98
rosbags/setup.cfg Normal file
View File

@ -0,0 +1,98 @@
[metadata]
name = rosbags
version = 0.9.15
author = Ternaris
author_email = team@ternaris.com
home_page = https://gitlab.com/ternaris/rosbags
description = Pure Python library to read, modify, convert, and write rosbag files.
long_description = file: README.rst
long_description_content_type = text/x-rst
keywords =
cdr
conversion
deserialization
idl
mcap
message
msg
reader
ros
rosbag
rosbag2
serialization
writer
license = Apache 2.0
license_files = LICENSE.txt
platform = any
classifiers =
Development Status :: 4 - Beta
License :: OSI Approved :: Apache Software License
Programming Language :: Python
Programming Language :: Python :: 3 :: Only
Programming Language :: Python :: 3.8
Programming Language :: Python :: 3.9
Programming Language :: Python :: 3.10
Programming Language :: Python :: 3.11
Topic :: Scientific/Engineering
Typing :: Typed
project_urls =
Code = https://gitlab.com/ternaris/rosbags
Documentation = https://ternaris.gitlab.io/rosbags
Issue tracker = https://gitlab.com/ternaris/rosbags/issues
[options]
include_package_data = true
package_dir =
= src
packages = find_namespace:
python_requires =
>=3.8.2
install_requires =
lz4
numpy
ruamel.yaml
zstandard
[options.entry_points]
console_scripts =
rosbags-convert = rosbags.convert.__main__:main
[options.extras_require]
dev =
darglint
flake8
flake8-annotations
flake8-bugbear
flake8-commas
flake8-comprehensions
flake8-docstrings
flake8-fixme
flake8-isort
flake8-mutable
flake8-print
flake8-pyprojecttoml
flake8-pytest-style
flake8-quotes
flake8-return
flake8-simplify
flake8-type-checking
flake8-use-fstring
mypy
pep8-naming
pylint
pytest
pytest-cov
sphinx
sphinx-autodoc-typehints
sphinx-rtd-theme
toml # required by yapf
yapf
[options.package_data]
* = py.typed
[options.packages.find]
where = src
[sdist]
formats = gztar, zip

View File

@ -0,0 +1,16 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Rosbags file format conversion.
The conversion function transforms files from the rosbag1 format to the latest rosbag2
format. It automatically matches ROS1 message types to their ROS2 counterparts
and adds custom types not present in the type system.
"""
from .converter import ConverterError, convert
__all__ = [
'ConverterError',
'convert',
]

View File

@ -0,0 +1,82 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""CLI tool for rosbag conversion."""
from __future__ import annotations
import argparse
import sys
from pathlib import Path
from typing import TYPE_CHECKING
from .converter import ConverterError, convert
if TYPE_CHECKING:
from typing import Callable
def pathtype(exists: bool = True) -> Callable[[str], Path]:
"""Path argument for argparse.
Args:
exists: Whether the path should exist in the filesystem.
Returns:
Argparse type function.
"""
def topath(pathname: str) -> Path:
path = Path(pathname)
if exists != path.exists():
raise argparse.ArgumentTypeError(
f'{path} should {"exist" if exists else "not exist"}.',
)
return path
return topath
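The pathtype factory above returns a closure suitable for argparse's `type=` parameter, so path validation errors surface as normal argument errors. A minimal stdlib-only sketch of the same pattern (the argument names and the `no_such_output_dir` path are illustrative, not part of the rosbags CLI):

```python
import argparse
from pathlib import Path

def pathtype(exists: bool = True):
    """Return an argparse type function enforcing path (non-)existence."""

    def topath(pathname: str) -> Path:
        path = Path(pathname)
        if exists != path.exists():
            raise argparse.ArgumentTypeError(
                f'{path} should {"exist" if exists else "not exist"}.',
            )
        return path

    return topath

parser = argparse.ArgumentParser()
parser.add_argument('src', type=pathtype())                # must exist
parser.add_argument('--dst', type=pathtype(exists=False))  # must not exist
args = parser.parse_args(['.', '--dst', 'no_such_output_dir'])
```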
def main() -> None:
"""Parse cli arguments and run conversion."""
parser = argparse.ArgumentParser(description='Convert between rosbag1 and rosbag2.')
parser.add_argument(
'src',
type=pathtype(),
help='source path to read rosbag1 or rosbag2 from',
)
parser.add_argument(
'--dst',
type=pathtype(exists=False),
help='destination path for converted rosbag',
)
topic_group = parser.add_argument_group('filtering').add_mutually_exclusive_group()
topic_group.add_argument(
'--exclude-topic',
action='append',
default=[],
dest='exclude_topics',
help='topic to exclude from conversion, even if included explicitly',
)
topic_group.add_argument(
'--include-topic',
action='append',
default=[],
dest='include_topics',
help='topic to include in conversion, instead of all',
)
args = parser.parse_args()
if args.dst is not None and (args.src.suffix == '.bag') == (args.dst.suffix == '.bag'):
print('Source and destination rosbag versions must differ.') # noqa: T201
sys.exit(1)
try:
convert(**args.__dict__)
except ConverterError as err:
print(f'ERROR: {err}') # noqa: T201
sys.exit(1)
if __name__ == '__main__':
main()

View File

@ -0,0 +1,239 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Rosbag1 to Rosbag2 Converter."""
from __future__ import annotations
from typing import TYPE_CHECKING
from rosbags.interfaces import Connection, ConnectionExtRosbag1, ConnectionExtRosbag2
from rosbags.rosbag1 import Reader as Reader1
from rosbags.rosbag1 import ReaderError as ReaderError1
from rosbags.rosbag1 import Writer as Writer1
from rosbags.rosbag1 import WriterError as WriterError1
from rosbags.rosbag2 import Reader as Reader2
from rosbags.rosbag2 import ReaderError as ReaderError2
from rosbags.rosbag2 import Writer as Writer2
from rosbags.rosbag2 import WriterError as WriterError2
from rosbags.serde import cdr_to_ros1, ros1_to_cdr
from rosbags.typesys import get_types_from_msg, register_types
from rosbags.typesys.msg import generate_msgdef
if TYPE_CHECKING:
from pathlib import Path
from typing import Any, Optional, Sequence
LATCH = """
- history: 3
depth: 0
reliability: 1
durability: 1
deadline:
sec: 2147483647
nsec: 4294967295
lifespan:
sec: 2147483647
nsec: 4294967295
liveliness: 1
liveliness_lease_duration:
sec: 2147483647
nsec: 4294967295
avoid_ros_namespace_conventions: false
""".strip()
class ConverterError(Exception):
"""Converter Error."""
def upgrade_connection(rconn: Connection) -> Connection:
"""Convert rosbag1 connection to rosbag2 connection.
Args:
rconn: Rosbag1 connection.
Returns:
Rosbag2 connection.
"""
assert isinstance(rconn.ext, ConnectionExtRosbag1)
return Connection(
rconn.id,
rconn.topic,
rconn.msgtype,
'',
'',
0,
ConnectionExtRosbag2(
'cdr',
LATCH if rconn.ext.latching else '',
),
None,
)
def downgrade_connection(rconn: Connection) -> Connection:
"""Convert rosbag2 connection to rosbag1 connection.
Args:
rconn: Rosbag2 connection.
Returns:
Rosbag1 connection.
"""
assert isinstance(rconn.ext, ConnectionExtRosbag2)
msgdef, md5sum = generate_msgdef(rconn.msgtype)
return Connection(
rconn.id,
rconn.topic,
rconn.msgtype,
msgdef,
md5sum,
-1,
ConnectionExtRosbag1(
None,
int('durability: 1' in rconn.ext.offered_qos_profiles),
),
None,
)
def convert_1to2(
src: Path,
dst: Path,
exclude_topics: Sequence[str],
include_topics: Sequence[str],
) -> None:
"""Convert Rosbag1 to Rosbag2.
Args:
src: Rosbag1 path.
dst: Rosbag2 path.
exclude_topics: Topics to exclude from conversion, even if included explicitly.
include_topics: Topics to include in conversion, instead of all.
Raises:
ConverterError: If all connections are excluded.
"""
with Reader1(src) as reader, Writer2(dst) as writer:
typs: dict[str, Any] = {}
connmap: dict[int, Connection] = {}
connections = [
x for x in reader.connections
if x.topic not in exclude_topics and (not include_topics or x.topic in include_topics)
]
if not connections:
raise ConverterError('No connections left for conversion.')
for rconn in connections:
candidate = upgrade_connection(rconn)
assert isinstance(candidate.ext, ConnectionExtRosbag2)
for conn in writer.connections:
assert isinstance(conn.ext, ConnectionExtRosbag2)
if (
conn.topic == candidate.topic and conn.msgtype == candidate.msgtype and
conn.ext == candidate.ext
):
break
else:
conn = writer.add_connection(
candidate.topic,
candidate.msgtype,
candidate.ext.serialization_format,
candidate.ext.offered_qos_profiles,
)
connmap[rconn.id] = conn
typs.update(get_types_from_msg(rconn.msgdef, rconn.msgtype))
register_types(typs)
for rconn, timestamp, data in reader.messages(connections=connections):
data = ros1_to_cdr(data, rconn.msgtype)
writer.write(connmap[rconn.id], timestamp, data)
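The connection-matching loop in convert_1to2 relies on Python's for/else: the else branch runs only when the inner loop finished without a break, i.e. when no existing connection matched. A small self-contained sketch of that control flow (the topic tuples are made-up stand-ins for Connection objects):

```python
# for/else deduplication: reuse a matching item, otherwise create it
existing = [('/scan', 'sensor_msgs/msg/LaserScan')]
candidates = [
    ('/scan', 'sensor_msgs/msg/LaserScan'),  # duplicate, reused
    ('/tf', 'tf2_msgs/msg/TFMessage'),       # new, appended
]

connmap = {}
for i, cand in enumerate(candidates):
    for item in existing:
        if item == cand:
            break
    else:  # inner loop ended without break: candidate is new
        existing.append(cand)
        item = cand
    connmap[i] = item
```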
def convert_2to1(
src: Path,
dst: Path,
exclude_topics: Sequence[str],
include_topics: Sequence[str],
) -> None:
"""Convert Rosbag2 to Rosbag1.
Args:
src: Rosbag2 path.
dst: Rosbag1 path.
exclude_topics: Topics to exclude from conversion, even if included explicitly.
include_topics: Topics to include in conversion, instead of all.
Raises:
ConverterError: If all connections are excluded.
"""
with Reader2(src) as reader, Writer1(dst) as writer:
connmap: dict[int, Connection] = {}
connections = [
x for x in reader.connections
if x.topic not in exclude_topics and (not include_topics or x.topic in include_topics)
]
if not connections:
raise ConverterError('No connections left for conversion.')
for rconn in connections:
candidate = downgrade_connection(rconn)
assert isinstance(candidate.ext, ConnectionExtRosbag1)
for conn in writer.connections:
assert isinstance(conn.ext, ConnectionExtRosbag1)
if (
conn.topic == candidate.topic and conn.md5sum == candidate.md5sum and
conn.ext.latching == candidate.ext.latching
):
break
else:
conn = writer.add_connection(
candidate.topic,
candidate.msgtype,
candidate.msgdef,
candidate.md5sum,
candidate.ext.callerid,
candidate.ext.latching,
)
connmap[rconn.id] = conn
for rconn, timestamp, data in reader.messages(connections=connections):
data = cdr_to_ros1(data, rconn.msgtype)
writer.write(connmap[rconn.id], timestamp, data)
def convert(
src: Path,
dst: Optional[Path],
exclude_topics: Sequence[str] = (),
include_topics: Sequence[str] = (),
) -> None:
"""Convert between Rosbag1 and Rosbag2.
Args:
src: Source rosbag.
dst: Destination rosbag.
exclude_topics: Topics to exclude from conversion, even if included explicitly.
include_topics: Topics to include in conversion, instead of all.
Raises:
ConverterError: An error occurred during reading, writing, or
converting.
"""
upgrade = src.suffix == '.bag'
dst = dst if dst else src.with_suffix('' if upgrade else '.bag')
if dst.exists():
raise ConverterError(f'Output path {str(dst)!r} exists already.')
func = convert_1to2 if upgrade else convert_2to1
try:
func(src, dst, exclude_topics, include_topics)
except (ReaderError1, ReaderError2) as err:
raise ConverterError(f'Reading source bag: {err}') from err
except (WriterError1, WriterError2) as err:
raise ConverterError(f'Writing destination bag: {err}') from err
except Exception as err:
raise ConverterError(f'Converting rosbag: {err!r}') from err
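convert derives both the direction and the default destination purely from the source suffix: '.bag' marks a rosbag1 file, anything else a rosbag2 directory. A sketch of that defaulting logic (`default_dst` is a hypothetical helper name, not rosbags API):

```python
from pathlib import Path

def default_dst(src: Path) -> Path:
    # '.bag' means rosbag1 -> rosbag2 (strip the suffix); otherwise the reverse
    upgrade = src.suffix == '.bag'
    return src.with_suffix('' if upgrade else '.bag')
```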

View File

View File

@ -0,0 +1,10 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Highlevel interfaces for rosbags."""
from .anyreader import AnyReader, AnyReaderError
__all__ = [
'AnyReader',
'AnyReaderError',
]

View File

@ -0,0 +1,269 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Tools for reading all rosbag versions with unified api."""
from __future__ import annotations
from contextlib import suppress
from dataclasses import dataclass
from heapq import merge
from itertools import groupby
from typing import TYPE_CHECKING
from rosbags.interfaces import TopicInfo
from rosbags.rosbag1 import Reader as Reader1
from rosbags.rosbag1 import ReaderError as ReaderError1
from rosbags.rosbag2 import Reader as Reader2
from rosbags.rosbag2 import ReaderError as ReaderError2
from rosbags.serde import deserialize_cdr, deserialize_ros1
from rosbags.typesys import get_types_from_msg, register_types, types
from rosbags.typesys.idl import get_types_from_idl
if TYPE_CHECKING:
import sys
from pathlib import Path
from types import TracebackType
from typing import Any, Generator, Iterable, Literal, Optional, Sequence, Type, Union
from rosbags.interfaces import Connection
from rosbags.typesys.base import Typesdict
from rosbags.typesys.register import Typestore
if sys.version_info < (3, 10):
from typing_extensions import TypeGuard
else:
from typing import TypeGuard
class AnyReaderError(Exception):
"""Reader error."""
ReaderErrors = (ReaderError1, ReaderError2)
def is_reader1(val: Union[Sequence[Reader1], Sequence[Reader2]]) -> TypeGuard[Sequence[Reader1]]:
"""Determine wether all items are Reader1 instances."""
return all(isinstance(x, Reader1) for x in val)
@dataclass
class SimpleTypeStore:
"""Simple type store implementation."""
FIELDDEFS: Typesdict # pylint: disable=invalid-name
def __hash__(self) -> int:
"""Create hash."""
return id(self)
class AnyReader:
"""Unified rosbag1 and rosbag2 reader."""
readers: Union[Sequence[Reader1], Sequence[Reader2]]
typestore: Typestore
def __init__(self, paths: Sequence[Path]):
"""Initialize RosbagReader.
Opens one or multiple rosbag1 recordings or a single rosbag2 recording.
Args:
paths: Paths to multiple rosbag1 files or single rosbag2 directory.
Raises:
AnyReaderError: If a path does not exist or multiple rosbag2 recordings are given.
"""
if not paths:
raise AnyReaderError('Must call with at least one path.')
if len(paths) > 1 and any((x / 'metadata.yaml').exists() for x in paths):
raise AnyReaderError('Opening of multiple rosbag2 recordings is not supported.')
if missing := [x for x in paths if not x.exists()]:
raise AnyReaderError(f'The following paths are missing: {missing!r}')
self.paths = paths
self.is2 = (paths[0] / 'metadata.yaml').exists()
self.isopen = False
self.connections: list[Connection] = []
try:
if self.is2:
self.readers = [Reader2(x) for x in paths]
else:
self.readers = [Reader1(x) for x in paths]
except ReaderErrors as err:
raise AnyReaderError(*err.args) from err
self.typestore = SimpleTypeStore({})
def _deser_ros1(self, rawdata: bytes, typ: str) -> object:
"""Deserialize ROS1 message."""
return deserialize_ros1(rawdata, typ, self.typestore)
def _deser_ros2(self, rawdata: bytes, typ: str) -> object:
"""Deserialize CDR message."""
return deserialize_cdr(rawdata, typ, self.typestore)
def deserialize(self, rawdata: bytes, typ: str) -> object:
"""Deserialize message with appropriate helper."""
return self._deser_ros2(rawdata, typ) if self.is2 else self._deser_ros1(rawdata, typ)
def open(self) -> None:
"""Open rosbags."""
assert not self.isopen
rollback = []
try:
for reader in self.readers:
reader.open()
rollback.append(reader)
except ReaderErrors as err:
for reader in rollback:
with suppress(*ReaderErrors):
reader.close()
raise AnyReaderError(*err.args) from err
for key in [
'builtin_interfaces/msg/Time',
'builtin_interfaces/msg/Duration',
'std_msgs/msg/Header',
]:
self.typestore.FIELDDEFS[key] = types.FIELDDEFS[key]
attr = key.replace('/', '__')
setattr(self.typestore, attr, getattr(types, attr))
typs: dict[str, Any] = {}
if self.is2:
reader = self.readers[0]
assert isinstance(reader, Reader2)
if reader.metadata['storage_identifier'] == 'mcap':
for connection in reader.connections:
if connection.md5sum:
if connection.md5sum == 'idl':
typ = get_types_from_idl(connection.msgdef)
else:
typ = get_types_from_msg(connection.msgdef, connection.msgtype)
typs.update(typ)
register_types(typs, self.typestore)
else:
for key, value in types.FIELDDEFS.items():
self.typestore.FIELDDEFS[key] = value
attr = key.replace('/', '__')
setattr(self.typestore, attr, getattr(types, attr))
else:
for reader in self.readers:
for connection in reader.connections:
typs.update(get_types_from_msg(connection.msgdef, connection.msgtype))
register_types(typs, self.typestore)
self.connections = [y for x in self.readers for y in x.connections]
self.isopen = True
def close(self) -> None:
"""Close rosbag."""
assert self.isopen
for reader in self.readers:
with suppress(*ReaderErrors):
reader.close()
self.isopen = False
def __enter__(self) -> AnyReader:
"""Open rosbags when entering contextmanager."""
self.open()
return self
def __exit__(
self,
exc_type: Optional[Type[BaseException]],
exc_val: Optional[BaseException],
exc_tb: Optional[TracebackType],
) -> Literal[False]:
"""Close rosbags when exiting contextmanager."""
self.close()
return False
@property
def duration(self) -> int:
"""Duration in nanoseconds between earliest and latest messages."""
return self.end_time - self.start_time
@property
def start_time(self) -> int:
"""Timestamp in nanoseconds of the earliest message."""
return min(x.start_time for x in self.readers)
@property
def end_time(self) -> int:
"""Timestamp in nanoseconds after the latest message."""
return max(x.end_time for x in self.readers)
@property
def message_count(self) -> int:
"""Total message count."""
return sum(x.message_count for x in self.readers)
@property
def topics(self) -> dict[str, TopicInfo]:
"""Topics stored in the rosbags."""
assert self.isopen
if self.is2:
assert isinstance(self.readers[0], Reader2)
return self.readers[0].topics
assert is_reader1(self.readers)
def summarize(names_infos: Iterable[tuple[str, TopicInfo]]) -> TopicInfo:
"""Summarize topic infos."""
infos = [x[1] for x in names_infos]
return TopicInfo(
msgtypes.pop() if len(msgtypes := {x.msgtype for x in infos}) == 1 else None,
msgdefs.pop() if len(msgdefs := {x.msgdef for x in infos}) == 1 else None,
sum(x.msgcount for x in infos),
sum((x.connections for x in infos), []),
)
return {
name: summarize(infos) for name, infos in groupby(
sorted(
(x for reader in self.readers for x in reader.topics.items()),
key=lambda x: x[0],
),
key=lambda x: x[0],
)
}
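The topics property merges per-reader topic infos with itertools.groupby, which only groups adjacent items, hence the sorted call before it. A stdlib-only sketch of the same aggregation over simplified (topic, msgcount) pairs:

```python
from itertools import groupby

infos = [('/scan', 10), ('/tf', 3), ('/scan', 5)]  # collected from several readers
counts = {
    topic: sum(count for _, count in group)
    for topic, group in groupby(
        sorted(infos, key=lambda x: x[0]),  # groupby needs sorted input
        key=lambda x: x[0],
    )
}
```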
def messages(
self,
connections: Iterable[Any] = (),
start: Optional[int] = None,
stop: Optional[int] = None,
) -> Generator[tuple[Any, int, bytes], None, None]:
"""Read messages from bags.
Args:
connections: Iterable with connections to filter for. An empty
iterable disables filtering on connections.
start: Yield only messages at or after this timestamp (ns).
stop: Yield only messages before this timestamp (ns).
Yields:
Tuples of connection, timestamp (ns), and rawdata.
"""
assert self.isopen
def get_owner(connection: Connection) -> Union[Reader1, Reader2]:
assert isinstance(connection.owner, (Reader1, Reader2))
return connection.owner
if connections:
generators = [
reader.messages(connections=list(conns), start=start, stop=stop) for reader, conns
in groupby(sorted(connections, key=lambda x: id(get_owner(x))), key=get_owner)
]
else:
generators = [reader.messages(start=start, stop=stop) for reader in self.readers]
yield from merge(*generators, key=lambda x: x[1])
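messages interleaves the already time-ordered per-reader streams with heapq.merge instead of concatenating and re-sorting, so iteration stays lazy and needs only a small heap. A sketch with two fake (connection, timestamp, rawdata) streams:

```python
from heapq import merge

# each reader yields messages sorted by timestamp (the middle element)
stream_a = [('conn_a', 1, b'x'), ('conn_a', 5, b'y')]
stream_b = [('conn_b', 2, b'p'), ('conn_b', 4, b'q')]

# merge keeps global timestamp order without materializing both streams
merged = list(merge(stream_a, stream_b, key=lambda x: x[1]))
```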

View File

View File

@ -0,0 +1,46 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Shared interfaces."""
from __future__ import annotations
from typing import TYPE_CHECKING, NamedTuple
if TYPE_CHECKING:
from typing import Optional, Union
class ConnectionExtRosbag1(NamedTuple):
"""Rosbag1 specific connection extensions."""
callerid: Optional[str]
latching: Optional[int]
class ConnectionExtRosbag2(NamedTuple):
"""Rosbag2 specific connection extensions."""
serialization_format: str
offered_qos_profiles: str
class Connection(NamedTuple):
"""Connection information."""
id: int
topic: str
msgtype: str
msgdef: str
md5sum: str
msgcount: int
ext: Union[ConnectionExtRosbag1, ConnectionExtRosbag2]
owner: object
class TopicInfo(NamedTuple):
"""Topic information."""
msgtype: Optional[str]
msgdef: Optional[str]
msgcount: int
connections: list[Connection]

View File

View File

@ -0,0 +1,21 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Rosbags support for rosbag1 files.
Readers and writers provide access to metadata and raw message content saved
in the rosbag1 format.
Supported versions:
- Rosbag1 v2.0
"""
from .reader import Reader, ReaderError
from .writer import Writer, WriterError
__all__ = [
'Reader',
'ReaderError',
'Writer',
'WriterError',
]

View File

View File

@ -0,0 +1,687 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Rosbag1 v2.0 reader."""
from __future__ import annotations
import heapq
import os
import re
import struct
from bz2 import decompress as bz2_decompress
from collections import defaultdict
from enum import Enum, IntEnum
from functools import reduce
from io import BytesIO
from itertools import groupby
from pathlib import Path
from typing import TYPE_CHECKING, Any, Dict, NamedTuple
from lz4.frame import decompress as lz4_decompress
from rosbags.interfaces import Connection, ConnectionExtRosbag1, TopicInfo
from rosbags.typesys.msg import normalize_msgtype
if TYPE_CHECKING:
from types import TracebackType
from typing import (
BinaryIO,
Callable,
Generator,
Iterable,
Literal,
Optional,
Tuple,
Type,
Union,
)
Unpack = Callable[[bytes], Tuple[int]]
UnpackFrom = Callable[[bytes, int], Tuple[int]]
class ReaderError(Exception):
"""Reader Error."""
class Compression(Enum):
"""Compression mode."""
NONE = 'none'
BZ2 = 'bz2'
LZ4 = 'lz4'
class RecordType(IntEnum):
"""Record type."""
MSGDATA = 2
BAGHEADER = 3
IDXDATA = 4
CHUNK = 5
CHUNK_INFO = 6
CONNECTION = 7
class ChunkInfo(NamedTuple):
"""Chunk information."""
pos: int
start_time: int
end_time: int
connection_counts: dict[int, int]
class Chunk(NamedTuple):
"""Chunk metadata."""
datasize: int
datapos: int
decompressor: Callable[[bytes], bytes]
class IndexData(NamedTuple):
"""Index data."""
time: int
chunk_pos: int
offset: int
def __lt__(self, other: tuple[int, ...]) -> bool:
"""Compare by time only."""
return self.time < other[0]
def __le__(self, other: tuple[int, ...]) -> bool:
"""Compare by time only."""
return self.time <= other[0]
def __eq__(self, other: object) -> bool:
"""Compare by time only."""
if isinstance(other, IndexData):
return self.time == other[0]
return NotImplemented # pragma: no cover
def __ge__(self, other: tuple[int, ...]) -> bool:
"""Compare by time only."""
return self.time >= other[0]
def __gt__(self, other: tuple[int, ...]) -> bool:
"""Compare by time only."""
return self.time > other[0]
def __ne__(self, other: object) -> bool:
"""Compare by time only."""
if isinstance(other, IndexData):
return self.time != other[0]
return NotImplemented # pragma: no cover
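IndexData overrides the comparison operators to order by time alone, so index entries from different chunks interleave correctly in heapq.merge regardless of their chunk position and offset fields. A minimal sketch of that time-only ordering:

```python
import heapq
from typing import NamedTuple

class IndexData(NamedTuple):
    time: int
    chunk_pos: int
    offset: int

    def __lt__(self, other):  # compare by time only, ignoring other fields
        return self.time < other[0]

# heapq.merge uses __lt__, so entries sort by time across chunks
entries = list(heapq.merge(
    [IndexData(1, 7, 0), IndexData(9, 7, 1)],
    [IndexData(4, 2, 0)],
))
```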
deserialize_uint8: Unpack = struct.Struct('<B').unpack # type: ignore
deserialize_uint32: UnpackFrom = struct.Struct('<L').unpack_from # type: ignore
deserialize_uint64: Unpack = struct.Struct('<Q').unpack # type: ignore
def deserialize_time(val: bytes) -> int:
"""Deserialize time value.
Args:
val: Serialized bytes.
Returns:
Deserialized value.
"""
unpacked: tuple[int, int] = struct.unpack('<LL', val) # type: ignore
sec, nsec = unpacked
return sec * 10**9 + nsec
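Rosbag1 stores timestamps as two little-endian uint32 values (seconds, then nanoseconds); deserialize_time collapses them into a single nanosecond integer. A roundtrip check using struct.pack:

```python
import struct

def deserialize_time(val: bytes) -> int:
    """Convert packed (sec, nsec) uint32 pair into nanoseconds."""
    sec, nsec = struct.unpack('<LL', val)
    return sec * 10**9 + nsec

stamp = deserialize_time(struct.pack('<LL', 2, 500))
```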
class Header(Dict[str, Any]):
"""Record header."""
def get_uint8(self, name: str) -> int:
"""Get uint8 value from field.
Args:
name: Name of field.
Returns:
Deserialized value.
Raises:
ReaderError: Field not present or not deserializable.
"""
try:
return deserialize_uint8(self[name])[0]
except (KeyError, struct.error) as err:
raise ReaderError(f'Could not read uint8 field {name!r}.') from err
def get_uint32(self, name: str) -> int:
"""Get uint32 value from field.
Args:
name: Name of field.
Returns:
Deserialized value.
Raises:
ReaderError: Field not present or not deserializable.
"""
try:
return deserialize_uint32(self[name], 0)[0]
except (KeyError, struct.error) as err:
raise ReaderError(f'Could not read uint32 field {name!r}.') from err
def get_uint64(self, name: str) -> int:
"""Get uint64 value from field.
Args:
name: Name of field.
Returns:
Deserialized value.
Raises:
ReaderError: Field not present or not deserializable.
"""
try:
return deserialize_uint64(self[name])[0]
except (KeyError, struct.error) as err:
raise ReaderError(f'Could not read uint64 field {name!r}.') from err
def get_string(self, name: str) -> str:
"""Get string value from field.
Args:
name: Name of field.
Returns:
Deserialized value.
Raises:
ReaderError: Field not present or not deserializable.
"""
try:
value = self[name]
assert isinstance(value, bytes)
return value.decode()
except (KeyError, ValueError) as err:
raise ReaderError(f'Could not read string field {name!r}.') from err
def get_time(self, name: str) -> int:
"""Get time value from field.
Args:
name: Name of field.
Returns:
Deserialized value.
Raises:
ReaderError: Field not present or not deserializable.
"""
try:
return deserialize_time(self[name])
except (KeyError, struct.error) as err:
raise ReaderError(f'Could not read time field {name!r}.') from err
@classmethod
def read(cls: Type[Header], src: BinaryIO, expect: Optional[RecordType] = None) -> Header:
"""Read header from file handle.
Args:
src: File handle.
expect: Expected record op.
Returns:
Header object.
Raises:
ReaderError: Header could not be parsed.
"""
try:
binary = read_bytes(src, read_uint32(src))
except ReaderError as err:
raise ReaderError('Header could not be read from file.') from err
header = cls()
pos = 0
length = len(binary)
while pos < length:
try:
size = deserialize_uint32(binary, pos)[0]
except struct.error as err:
raise ReaderError('Header field size could not be read.') from err
pos += 4
if pos + size > length:
raise ReaderError('Declared field size is too large for header.')
name, sep, value = binary[pos:pos + size].partition(b'=')
if not sep:
raise ReaderError('Header field could not be parsed.')
pos += size
header[name.decode()] = value
if expect:
have = header.get_uint8('op')
if expect != have:
raise ReaderError(f'Record of type {RecordType(have).name!r} is unexpected.')
return header
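Header.read parses the rosbag1 record header layout: a uint32 total length, then fields that each consist of a uint32 length prefix and a name=value payload. A self-contained sketch that builds and parses such a blob (`build_header`/`parse_header` are illustrative helpers, not rosbags API):

```python
import struct

def build_header(fields):
    """Serialize {name: value} byte pairs into a rosbag1-style header blob."""
    recs = (name + b'=' + value for name, value in fields.items())
    body = b''.join(struct.pack('<L', len(rec)) + rec for rec in recs)
    return struct.pack('<L', len(body)) + body

def parse_header(blob):
    """Parse a header blob back into a {name: value} dict."""
    length, = struct.unpack_from('<L', blob, 0)
    pos, end, header = 4, 4 + length, {}
    while pos < end:
        size, = struct.unpack_from('<L', blob, pos)
        pos += 4
        name, _, value = blob[pos:pos + size].partition(b'=')
        header[name.decode()] = value
        pos += size
    return header

hdr = parse_header(build_header({b'op': b'\x03', b'conn_count': b'\x02\x00\x00\x00'}))
```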
def read_uint32(src: BinaryIO) -> int:
"""Read uint32 from source.
Args:
src: File handle.
Returns:
Uint32 value.
Raises:
ReaderError: Value unreadable or not deserializable.
"""
try:
return deserialize_uint32(src.read(4), 0)[0]
except struct.error as err:
raise ReaderError('Could not read uint32.') from err
def read_bytes(src: BinaryIO, size: int) -> bytes:
"""Read bytes from source.
Args:
src: File handle.
size: Number of bytes to read.
Returns:
Read bytes.
Raises:
ReaderError: Not enough bytes available.
"""
data = src.read(size)
if len(data) != size:
raise ReaderError(f'Got only {len(data)} of requested {size} bytes.')
return data
def normalize(name: str) -> str:
"""Normalize topic name.
Args:
name: Topic name.
Returns:
Normalized name.
"""
return f'{"/" * (name[0] == "/")}{"/".join(x for x in name.split("/") if x)}'
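normalize collapses repeated slashes while preserving whether the topic name was absolute. The f-string is dense; its behavior is easiest to see by example:

```python
def normalize(name: str) -> str:
    """Collapse duplicate slashes, keeping a leading slash if present."""
    return f'{"/" * (name[0] == "/")}{"/".join(x for x in name.split("/") if x)}'
```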
class Reader:
"""Rosbag 1 version 2.0 reader.
This class is designed for a ROS2 world; it automatically normalizes
message type names to be in line with their ROS2 counterparts.
"""
# pylint: disable=too-many-instance-attributes
def __init__(self, path: Union[str, Path]):
"""Initialize.
Args:
path: Filesystem path to bag.
Raises:
ReaderError: Path does not exist.
"""
self.path = Path(path)
if not self.path.exists():
raise ReaderError(f'File {str(self.path)!r} does not exist.')
self.bio: Optional[BinaryIO] = None
self.connections: list[Connection] = []
self.indexes: dict[int, list[IndexData]] = {}
self.index_data_header_offsets: Optional[tuple[int, int]] = None
self.chunk_infos: list[ChunkInfo] = []
self.chunks: dict[int, Chunk] = {}
self.current_chunk: tuple[int, BinaryIO] = (-1, BytesIO())
def open(self) -> None:
"""Open rosbag and read metadata."""
try:
self.bio = self.path.open('rb') # pylint: disable=consider-using-with
except OSError as err:
raise ReaderError(f'Could not open file {str(self.path)!r}: {err.strerror}.') from err
try:
magic = self.bio.readline().decode()
if not magic:
raise ReaderError(f'File {str(self.path)!r} seems to be empty.')
matches = re.match(r'#ROSBAG V(\d+)\.(\d+)\n', magic)
if not matches:
raise ReaderError('File magic is invalid.')
major, minor = matches.groups()
version = int(major) * 100 + int(minor)
if version != 200:
raise ReaderError(f'Bag version {version!r} is not supported.')
header = Header.read(self.bio, RecordType.BAGHEADER)
index_pos = header.get_uint64('index_pos')
conn_count = header.get_uint32('conn_count')
chunk_count = header.get_uint32('chunk_count')
try:
encryptor: Optional[str] = header.get_string('encryptor')
except ReaderError:
encryptor = None
if encryptor:
raise ReaderError(f'Bag encryption {encryptor!r} is not supported.') from None
if index_pos == 0:
raise ReaderError('Bag is not indexed, reindex before reading.')
if chunk_count == 0:
return
self.bio.seek(index_pos)
try:
self.connections = [self.read_connection() for _ in range(conn_count)]
self.chunk_infos = [self.read_chunk_info() for _ in range(chunk_count)]
except ReaderError as err:
raise ReaderError(f'Bag index looks damaged: {err.args}') from None
self.chunks = {}
indexes: dict[int, list[IndexData]] = defaultdict(list)
for chunk_info in self.chunk_infos:
self.bio.seek(chunk_info.pos)
self.chunks[chunk_info.pos] = self.read_chunk()
for _ in range(len(chunk_info.connection_counts)):
self.read_index_data(chunk_info.pos, indexes)
self.indexes = {cid: sorted(x) for cid, x in indexes.items()}
assert all(self.indexes[x.id] for x in self.connections)
self.connections = [
Connection(
*x[0:5],
len(self.indexes[x.id]),
*x[6:],
) for x in self.connections
]
except ReaderError:
self.close()
raise
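open validates the file magic ('#ROSBAG V2.0\n') and encodes the version as major * 100 + minor, accepting only 200. A sketch of that check (`parse_version` is a hypothetical helper name):

```python
import re

def parse_version(magic: str) -> int:
    """Parse the rosbag file magic into an integer version."""
    matches = re.match(r'#ROSBAG V(\d+)\.(\d+)\n', magic)
    if not matches:
        raise ValueError('File magic is invalid.')
    major, minor = matches.groups()
    return int(major) * 100 + int(minor)

version = parse_version('#ROSBAG V2.0\n')
```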
def close(self) -> None:
"""Close rosbag."""
assert self.bio
self.bio.close()
self.bio = None
@property
def duration(self) -> int:
"""Duration in nanoseconds between earliest and latest messages."""
return self.end_time - self.start_time if self.chunk_infos else 0
@property
def start_time(self) -> int:
"""Timestamp in nanoseconds of the earliest message."""
return min(x.start_time for x in self.chunk_infos) if self.chunk_infos else 2**63 - 1
@property
def end_time(self) -> int:
"""Timestamp in nanoseconds after the latest message."""
return max(x.end_time for x in self.chunk_infos) if self.chunk_infos else 0
@property
def message_count(self) -> int:
"""Total message count."""
return reduce(lambda x, y: x + y, (x.msgcount for x in self.topics.values()), 0)
@property
def topics(self) -> dict[str, TopicInfo]:
"""Topic information."""
topics = {}
for topic, group in groupby(
sorted(self.connections, key=lambda x: x.topic),
key=lambda x: x.topic,
):
connections = list(group)
msgcount = reduce(
lambda x, y: x + y,
(y.connection_counts.get(x.id, 0) for x in connections for y in self.chunk_infos),
)
topics[topic] = TopicInfo(
msgtypes.pop() if len(msgtypes := {x.msgtype for x in connections}) == 1 else None,
msgdefs.pop() if len(msgdefs := {x.msgdef for x in connections}) == 1 else None,
msgcount,
connections,
)
return topics
def read_connection(self) -> Connection:
"""Read connection record from current position."""
assert self.bio
header = Header.read(self.bio, RecordType.CONNECTION)
conn = header.get_uint32('conn')
topic = normalize(header.get_string('topic'))
header = Header.read(self.bio)
typ = header.get_string('type')
md5sum = header.get_string('md5sum')
msgdef = header.get_string('message_definition')
callerid = header.get_string('callerid') if 'callerid' in header else None
latching = int(header.get_string('latching')) if 'latching' in header else None
return Connection(
conn,
topic,
normalize_msgtype(typ),
msgdef,
md5sum,
0,
ConnectionExtRosbag1(
callerid,
latching,
),
self,
)
def read_chunk_info(self) -> ChunkInfo:
"""Read chunk info record from current position."""
assert self.bio
header = Header.read(self.bio, RecordType.CHUNK_INFO)
ver = header.get_uint32('ver')
if ver != 1:
raise ReaderError(f'CHUNK_INFO version {ver} is not supported.')
chunk_pos = header.get_uint64('chunk_pos')
start_time = header.get_time('start_time')
end_time = header.get_time('end_time') + 1
count = header.get_uint32('count')
self.bio.seek(4, os.SEEK_CUR)
return ChunkInfo(
chunk_pos,
start_time,
end_time,
{read_uint32(self.bio): read_uint32(self.bio) for _ in range(count)},
)
def read_chunk(self) -> Chunk:
"""Read chunk record header from current position."""
assert self.bio
header = Header.read(self.bio, RecordType.CHUNK)
compression = header.get_string('compression')
datasize = read_uint32(self.bio)
datapos = self.bio.tell()
self.bio.seek(datasize, os.SEEK_CUR)
try:
decompressor = {
Compression.NONE.value: lambda x: x,
Compression.BZ2.value: bz2_decompress,
Compression.LZ4.value: lz4_decompress,
}[compression]
except KeyError:
raise ReaderError(f'Compression {compression!r} is not supported.') from None
return Chunk(
datasize,
datapos,
decompressor,
)
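read_chunk maps the chunk's declared compression to a decompressor callable, using identity for uncompressed chunks. A stdlib-only sketch covering the 'none' and 'bz2' cases (lz4 needs the external lz4 package, so it is left as a comment):

```python
import bz2

# compression name -> decompressor callable, as in read_chunk
decompressors = {
    'none': lambda x: x,
    'bz2': bz2.decompress,
    # 'lz4': lz4.frame.decompress,  # requires the lz4 package
}

payload = b'chunk payload bytes'
restored = decompressors['bz2'](bz2.compress(payload))
```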
def read_index_data(self, pos: int, indexes: dict[int, list[IndexData]]) -> None:
"""Read index data from position.
The implementation purposely avoids the generic Header class and
its costly string processing.
Args:
pos: Seek position.
indexes: Accumulated index data.
Raises:
ReaderError: Record unreadable.
"""
assert self.bio
buf = self.bio.read(55)
if not self.index_data_header_offsets:
size, = deserialize_uint32(buf, 0)
assert size == 47
idx = 4
connpos = -1
countpos = -1
while idx < size:
char = buf[idx + 6]
if char == 61: # ord(b'=')
assert buf[idx + 7] == 4
idx += 8
elif char == 114: # ord(b'r')
if (ver := buf[idx + 8]) != 1:
raise ReaderError(f'IDXDATA version {ver} is not supported.')
idx += 12
elif char == 110: # ord(b'n')
connpos = idx + 9
idx += 13
else:
assert char == 117 # ord(b'u')
countpos = idx + 10
idx += 14
self.index_data_header_offsets = (connpos, countpos)
connpos, countpos = self.index_data_header_offsets
conn, = deserialize_uint32(buf, connpos)
count, = deserialize_uint32(buf, countpos)
size, = deserialize_uint32(buf, 51)
assert size == count * 12
index = indexes[conn]
buf = self.bio.read(size)
idx = 0
while idx < size:
time = deserialize_uint32(buf, idx)[0] * 10**9 + deserialize_uint32(buf, idx + 4)[0]
offset, = deserialize_uint32(buf, idx + 8)
idx += 12
index.append(IndexData(time, pos, offset))
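The tight loop above decodes IDXDATA payload entries of exactly 12 bytes each: uint32 seconds, uint32 nanoseconds, and a uint32 chunk-relative offset. A minimal stand-alone sketch of that layout (the helper names here are illustrative, not part of the file):

```python
import struct

def pack_index_entry(time_ns: int, offset: int) -> bytes:
    """Pack one 12-byte IDXDATA entry: uint32 sec, uint32 nsec, uint32 offset."""
    return struct.pack('<LLL', time_ns // 10**9, time_ns % 10**9, offset)

def unpack_index_entry(buf: bytes, idx: int = 0) -> tuple:
    """Mirror of the decode loop: rebuild (time_ns, offset) from 12 bytes."""
    sec, nsec, offset = struct.unpack_from('<LLL', buf, idx)
    return sec * 10**9 + nsec, offset

entry = pack_index_entry(1_650_000_000_000_000_042, 128)
assert len(entry) == 12
assert unpack_index_entry(entry) == (1_650_000_000_000_000_042, 128)
```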
def messages(
self,
connections: Iterable[Connection] = (),
start: Optional[int] = None,
stop: Optional[int] = None,
) -> Generator[tuple[Connection, int, bytes], None, None]:
"""Read messages from bag.
Args:
connections: Iterable with connections to filter for. An empty
iterable disables filtering on connections.
start: Yield only messages at or after this timestamp (ns).
stop: Yield only messages before this timestamp (ns).
Yields:
Tuples of connection, timestamp (ns), and rawdata.
Raises:
ReaderError: Bag not open or data corrupt.
"""
if not self.bio:
raise ReaderError('Rosbag is not open.')
if not connections:
connections = self.connections
connmap = {x.id: x for x in self.connections}
indexes = [self.indexes[x.id] for x in connections]
for entry in heapq.merge(*indexes):
if start and entry.time < start:
continue
if stop and entry.time >= stop:
return
if self.current_chunk[0] != entry.chunk_pos:
self.current_chunk[1].close()
chunk_header = self.chunks[entry.chunk_pos]
self.bio.seek(chunk_header.datapos)
rawbytes = chunk_header.decompressor(read_bytes(self.bio, chunk_header.datasize))
self.current_chunk = (entry.chunk_pos, BytesIO(rawbytes))
chunk = self.current_chunk[1]
chunk.seek(entry.offset)
while True:
header = Header.read(chunk)
have = header.get_uint8('op')
if have != RecordType.CONNECTION:
break
chunk.seek(read_uint32(chunk), os.SEEK_CUR)
if have != RecordType.MSGDATA:
raise ReaderError('Expected to find message data.')
data = read_bytes(chunk, read_uint32(chunk))
connection = connmap[header.get_uint32('conn')]
assert entry.time == header.get_time('time')
yield connection, entry.time, data
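The per-connection indexes consumed above are each sorted by time, so `heapq.merge` can interleave them into one globally time-ordered stream without materializing all entries. A small self-contained illustration of that merge step, using a simplified stand-in for the reader's index tuples:

```python
import heapq
from typing import NamedTuple

class IndexData(NamedTuple):
    """Simplified stand-in for the reader's index entries."""
    time: int
    chunk_pos: int
    offset: int

# Each per-connection index is sorted by time; heapq.merge yields a single
# globally ordered stream, as messages() does before start/stop filtering.
idx_a = [IndexData(10, 0, 0), IndexData(30, 0, 64)]
idx_b = [IndexData(20, 0, 32), IndexData(40, 1, 0)]
merged = list(heapq.merge(idx_a, idx_b))
assert [e.time for e in merged] == [10, 20, 30, 40]
```

Because `IndexData` is a NamedTuple with `time` as its first field, the default tuple ordering used by `heapq.merge` sorts by timestamp.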
def __enter__(self) -> Reader:
"""Open rosbag1 when entering contextmanager."""
self.open()
return self
def __exit__(
self,
exc_type: Optional[Type[BaseException]],
exc_val: Optional[BaseException],
exc_tb: Optional[TracebackType],
) -> Literal[False]:
"""Close rosbag1 when exiting contextmanager."""
self.close()
return False



@ -0,0 +1,411 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Rosbag1 writer."""
from __future__ import annotations
import struct
from bz2 import compress as bz2_compress
from collections import defaultdict
from dataclasses import dataclass
from enum import IntEnum, auto
from io import BytesIO
from pathlib import Path
from typing import TYPE_CHECKING, Any, Dict
from lz4.frame import compress as lz4_compress
from rosbags.interfaces import Connection, ConnectionExtRosbag1
from rosbags.typesys.msg import denormalize_msgtype, generate_msgdef
from .reader import RecordType
if TYPE_CHECKING:
from types import TracebackType
from typing import BinaryIO, Callable, Literal, Optional, Type, Union
class WriterError(Exception):
"""Writer Error."""
@dataclass
class WriteChunk:
"""In progress chunk."""
data: BytesIO
pos: int
start: int
end: int
connections: dict[int, list[tuple[int, int]]]
serialize_uint8 = struct.Struct('<B').pack
serialize_uint32 = struct.Struct('<L').pack
serialize_uint64 = struct.Struct('<Q').pack
def serialize_time(val: int) -> bytes:
"""Serialize time value.
Args:
val: Time value.
Returns:
Serialized bytes.
"""
sec, nsec = val // 10**9, val % 10**9
return struct.pack('<LL', sec, nsec)
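The split performed by `serialize_time` can be sanity-checked with a hypothetical inverse helper (`deserialize_time` is not part of this file; it just mirrors the reader-side reconstruction of nanosecond timestamps):

```python
import struct

def serialize_time(val: int) -> bytes:
    """Split a nanosecond timestamp into a little-endian (sec, nsec) pair."""
    sec, nsec = val // 10**9, val % 10**9
    return struct.pack('<LL', sec, nsec)

def deserialize_time(raw: bytes) -> int:
    """Hypothetical inverse: rebuild the nanosecond timestamp."""
    sec, nsec = struct.unpack('<LL', raw)
    return sec * 10**9 + nsec

stamp = 1_700_000_000_123_456_789
assert deserialize_time(serialize_time(stamp)) == stamp
```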
class Header(Dict[str, Any]):
"""Record header."""
def set_uint32(self, name: str, value: int) -> None:
"""Set field to uint32 value.
Args:
name: Field name.
value: Field value.
"""
self[name] = serialize_uint32(value)
def set_uint64(self, name: str, value: int) -> None:
"""Set field to uint64 value.
Args:
name: Field name.
value: Field value.
"""
self[name] = serialize_uint64(value)
def set_string(self, name: str, value: str) -> None:
"""Set field to string value.
Args:
name: Field name.
value: Field value.
"""
self[name] = value.encode()
def set_time(self, name: str, value: int) -> None:
"""Set field to time value.
Args:
name: Field name.
value: Field value.
"""
self[name] = serialize_time(value)
def write(self, dst: BinaryIO, opcode: Optional[RecordType] = None) -> int:
"""Write to file handle.
Args:
dst: File handle.
opcode: Record type code.
Returns:
Bytes written.
"""
data = b''
if opcode:
keqv = 'op='.encode() + serialize_uint8(opcode)
data += serialize_uint32(len(keqv)) + keqv
for key, value in self.items():
keqv = f'{key}='.encode() + value
data += serialize_uint32(len(keqv)) + keqv
size = len(data)
dst.write(serialize_uint32(size) + data)
return size + 4
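`Header.write` emits the rosbag1 record-header wire format: each field is a uint32-length-prefixed `key=value` entry, and the concatenated entries are themselves prefixed with a uint32 total size. A simplified sketch of that encoding (function name is my own; values are taken as raw bytes):

```python
import struct

def encode_header(fields: dict) -> bytes:
    """Encode a rosbag1 record header: uint32 total size, then
    uint32-length-prefixed b'key=value' entries (sketch of Header.write)."""
    data = b''
    for key, value in fields.items():
        keqv = key.encode() + b'=' + value
        data += struct.pack('<L', len(keqv)) + keqv
    return struct.pack('<L', len(data)) + data

raw = encode_header({'op': b'\x07', 'topic': b'/chat'})
size, = struct.unpack_from('<L', raw)
assert size == len(raw) - 4  # leading size prefix excludes itself
```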
class Writer:
"""Rosbag1 writer.
This class implements writing of rosbag1 files in version 2.0. It should be
used as a contextmanager.
"""
class CompressionFormat(IntEnum):
"""Compession formats."""
BZ2 = auto()
LZ4 = auto()
def __init__(self, path: Union[Path, str]):
"""Initialize writer.
Args:
path: Filesystem path to bag.
Raises:
WriterError: Target path exists already; Writer can only create new rosbags.
"""
path = Path(path)
self.path = path
if path.exists():
raise WriterError(f'{path} exists already, not overwriting.')
self.bio: Optional[BinaryIO] = None
self.compressor: Callable[[bytes], bytes] = lambda x: x
self.compression_format = 'none'
self.connections: list[Connection] = []
self.chunks: list[WriteChunk] = [
WriteChunk(BytesIO(), -1, 2**64, 0, defaultdict(list)),
]
self.chunk_threshold = 1 * (1 << 20)
def set_compression(self, fmt: CompressionFormat) -> None:
"""Enable compression on rosbag1.
This function has to be called before opening.
Args:
fmt: Compressor to use, bz2 or lz4.
Raises:
WriterError: Bag already open.
"""
if self.bio:
raise WriterError(f'Cannot set compression, bag {self.path} already open.')
self.compression_format = fmt.name.lower()
bz2: Callable[[bytes], bytes] = lambda x: bz2_compress(x, 9)
lz4: Callable[[bytes], bytes] = lambda x: lz4_compress(x, 0) # type: ignore
self.compressor = {
'bz2': bz2,
'lz4': lz4,
}[self.compression_format]
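The dispatch in `set_compression` maps the lowercased format name to a `bytes -> bytes` callable. Only the stdlib bz2 path can be sketched without third-party dependencies (the real class also wires up lz4 via `lz4.frame`):

```python
import bz2

# Sketch of the compressor lookup, stdlib subset only.
compressors = {
    'none': lambda x: x,                   # identity: no compression
    'bz2': lambda x: bz2.compress(x, 9),   # maximum bz2 compression level
}

payload = b'rosbag chunk payload ' * 100
packed = compressors['bz2'](payload)
assert bz2.decompress(packed) == payload
assert compressors['none'](payload) == payload
```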
def open(self) -> None:
"""Open rosbag1 for writing."""
try:
self.bio = self.path.open('xb') # pylint: disable=consider-using-with
except FileExistsError:
raise WriterError(f'{self.path} exists already, not overwriting.') from None
assert self.bio
self.bio.write(b'#ROSBAG V2.0\n')
header = Header()
header.set_uint64('index_pos', 0)
header.set_uint32('conn_count', 0)
header.set_uint32('chunk_count', 0)
size = header.write(self.bio, RecordType.BAGHEADER)
padsize = 4096 - 4 - size
self.bio.write(serialize_uint32(padsize) + b' ' * padsize)
def add_connection( # pylint: disable=too-many-arguments
self,
topic: str,
msgtype: str,
msgdef: Optional[str] = None,
md5sum: Optional[str] = None,
callerid: Optional[str] = None,
latching: Optional[int] = None,
**_kw: Any, # noqa: ANN401
) -> Connection:
"""Add a connection.
This function can only be called after opening a bag.
Args:
topic: Topic name.
msgtype: Message type.
msgdef: Message definition.
md5sum: Message hash.
callerid: Caller id.
latching: Latching information.
_kw: Ignored to allow consuming dicts from connection objects.
Returns:
The newly created connection.
Raises:
WriterError: Bag not open or identical topic previously registered.
"""
if not self.bio:
raise WriterError('Bag was not opened.')
if msgdef is None or md5sum is None:
msgdef, md5sum = generate_msgdef(msgtype)
assert msgdef
assert md5sum
connection = Connection(
len(self.connections),
topic,
denormalize_msgtype(msgtype),
msgdef,
md5sum,
-1,
ConnectionExtRosbag1(
callerid,
latching,
),
self,
)
if any(x[1:] == connection[1:] for x in self.connections):
raise WriterError(
f'Connections can only be added once with same arguments: {connection!r}.',
)
bio = self.chunks[-1].data
self.write_connection(connection, bio)
self.connections.append(connection)
return connection
def write(self, connection: Connection, timestamp: int, data: bytes) -> None:
"""Write message to rosbag1.
Args:
connection: Connection to write message to.
timestamp: Message timestamp (ns).
data: Serialized message data.
Raises:
WriterError: Bag not open or connection not registered.
"""
if not self.bio:
raise WriterError('Bag was not opened.')
if connection not in self.connections:
raise WriterError(f'There is no connection {connection!r}.') from None
chunk = self.chunks[-1]
chunk.connections[connection.id].append((timestamp, chunk.data.tell()))
if timestamp < chunk.start:
chunk.start = timestamp
if timestamp > chunk.end:
chunk.end = timestamp
header = Header()
header.set_uint32('conn', connection.id)
header.set_time('time', timestamp)
header.write(chunk.data, RecordType.MSGDATA)
chunk.data.write(serialize_uint32(len(data)))
chunk.data.write(data)
if chunk.data.tell() > self.chunk_threshold:
self.write_chunk(chunk)
@staticmethod
def write_connection(connection: Connection, bio: BinaryIO) -> None:
"""Write connection record."""
header = Header()
header.set_uint32('conn', connection.id)
header.set_string('topic', connection.topic)
header.write(bio, RecordType.CONNECTION)
header = Header()
header.set_string('topic', connection.topic)
header.set_string('type', connection.msgtype)
header.set_string('md5sum', connection.md5sum)
header.set_string('message_definition', connection.msgdef)
assert isinstance(connection.ext, ConnectionExtRosbag1)
if connection.ext.callerid is not None:
header.set_string('callerid', connection.ext.callerid)
if connection.ext.latching is not None:
header.set_string('latching', str(connection.ext.latching))
header.write(bio)
def write_chunk(self, chunk: WriteChunk) -> None:
"""Write open chunk to file."""
assert self.bio
if (size := chunk.data.tell()) > 0:
chunk.pos = self.bio.tell()
header = Header()
header.set_string('compression', self.compression_format)
header.set_uint32('size', size)
header.write(self.bio, RecordType.CHUNK)
data = self.compressor(chunk.data.getvalue())
self.bio.write(serialize_uint32(len(data)))
self.bio.write(data)
for cid, items in chunk.connections.items():
header = Header()
header.set_uint32('ver', 1)
header.set_uint32('conn', cid)
header.set_uint32('count', len(items))
header.write(self.bio, RecordType.IDXDATA)
self.bio.write(serialize_uint32(len(items) * 12))
for time, offset in items:
self.bio.write(serialize_time(time) + serialize_uint32(offset))
chunk.data.close()
self.chunks.append(WriteChunk(BytesIO(), -1, 2**64, 0, defaultdict(list)))
def close(self) -> None:
"""Close rosbag1 after writing.
Closes open chunks and writes index.
"""
assert self.bio
for chunk in self.chunks:
if chunk.pos == -1:
self.write_chunk(chunk)
index_pos = self.bio.tell()
for connection in self.connections:
self.write_connection(connection, self.bio)
for chunk in self.chunks:
if chunk.pos == -1:
continue
header = Header()
header.set_uint32('ver', 1)
header.set_uint64('chunk_pos', chunk.pos)
header.set_time('start_time', 0 if chunk.start == 2**64 else chunk.start)
header.set_time('end_time', chunk.end)
header.set_uint32('count', len(chunk.connections))
header.write(self.bio, RecordType.CHUNK_INFO)
self.bio.write(serialize_uint32(len(chunk.connections) * 8))
for cid, items in chunk.connections.items():
self.bio.write(serialize_uint32(cid) + serialize_uint32(len(items)))
self.bio.seek(13)
header = Header()
header.set_uint64('index_pos', index_pos)
header.set_uint32('conn_count', len(self.connections))
header.set_uint32('chunk_count', len([x for x in self.chunks if x.pos != -1]))
size = header.write(self.bio, RecordType.BAGHEADER)
padsize = 4096 - 4 - size
self.bio.write(serialize_uint32(padsize) + b' ' * padsize)
self.bio.close()
def __enter__(self) -> Writer:
"""Open rosbag1 when entering contextmanager."""
self.open()
return self
def __exit__(
self,
exc_type: Optional[Type[BaseException]],
exc_val: Optional[BaseException],
exc_tb: Optional[TracebackType],
) -> Literal[False]:
"""Close rosbag1 when exiting contextmanager."""
self.close()
return False


@ -0,0 +1,19 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Rosbags support for rosbag2 files.
Readers and writers provide access to metadata and raw message content saved
in the rosbag2 format.
"""
from .errors import ReaderError
from .reader import Reader
from .writer import Writer, WriterError
__all__ = [
'Reader',
'ReaderError',
'Writer',
'WriterError',
]


@ -0,0 +1,9 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Rosbag2 errors."""
from __future__ import annotations
class ReaderError(Exception):
"""Reader Error."""


@ -0,0 +1,60 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Rosbag2 metadata."""
from __future__ import annotations
from typing import TypedDict
class StartingTime(TypedDict):
"""Bag starting time."""
nanoseconds_since_epoch: int
class Duration(TypedDict):
"""Bag starting time."""
nanoseconds: int
class TopicMetadata(TypedDict):
"""Topic metadata."""
name: str
type: str
serialization_format: str
offered_qos_profiles: str
class TopicWithMessageCount(TypedDict):
"""Topic with message count."""
message_count: int
topic_metadata: TopicMetadata
class FileInformation(TypedDict):
"""Per file metadata."""
path: str
starting_time: StartingTime
duration: Duration
message_count: int
class Metadata(TypedDict):
"""Rosbag2 metadata file."""
version: int
storage_identifier: str
relative_file_paths: list[str]
starting_time: StartingTime
duration: Duration
message_count: int
compression_format: str
compression_mode: str
topics_with_message_count: list[TopicWithMessageCount]
files: list[FileInformation]
custom_data: dict[str, str]


@ -0,0 +1,273 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Rosbag2 reader."""
from __future__ import annotations
from pathlib import Path
from tempfile import TemporaryDirectory
from typing import TYPE_CHECKING, Protocol
import zstandard
from ruamel.yaml import YAML
from ruamel.yaml.error import YAMLError
from rosbags.interfaces import Connection, ConnectionExtRosbag2, TopicInfo
from .errors import ReaderError
from .storage_mcap import ReaderMcap
from .storage_sqlite3 import ReaderSqlite3
if TYPE_CHECKING:
from types import TracebackType
from typing import Generator, Iterable, Literal, Optional, Type, Union
from .metadata import FileInformation, Metadata
class StorageProtocol(Protocol):
"""Storage Protocol."""
def __init__(self, paths: Iterable[Path], connections: Iterable[Connection]):
"""Initialize."""
raise NotImplementedError # pragma: no cover
def open(self) -> None:
"""Open file."""
raise NotImplementedError # pragma: no cover
def close(self) -> None:
"""Close file."""
raise NotImplementedError # pragma: no cover
def get_definitions(self) -> dict[str, tuple[str, str]]:
"""Get message definitions."""
raise NotImplementedError # pragma: no cover
def messages(
self,
connections: Iterable[Connection] = (),
start: Optional[int] = None,
stop: Optional[int] = None,
) -> Generator[tuple[Connection, int, bytes], None, None]:
"""Get messages from file."""
raise NotImplementedError # pragma: no cover
class Reader:
"""Reader for rosbag2 files.
It implements all necessary features to access metadata and message
streams.
Version history:
- Version 1: Initial format.
- Version 2: Changed field sizes in C++ implementation.
- Version 3: Added compression.
- Version 4: Added QoS metadata to topics, changed relative file paths.
- Version 5: Added per file metadata.
- Version 6: Added custom_data dict to metadata.
"""
# pylint: disable=too-many-instance-attributes
STORAGE_PLUGINS: dict[str, Type[StorageProtocol]] = {
'mcap': ReaderMcap,
'sqlite3': ReaderSqlite3,
}
def __init__(self, path: Union[Path, str]):
"""Open rosbag and check metadata.
Args:
path: Filesystem path to bag.
Raises:
ReaderError: Bag not readable or metadata malformed.
"""
path = Path(path)
yamlpath = path / 'metadata.yaml'
self.path = path
try:
yaml = YAML(typ='safe')
dct = yaml.load(yamlpath.read_text())
except OSError as err:
raise ReaderError(f'Could not read metadata at {yamlpath}: {err}.') from None
except YAMLError as exc:
raise ReaderError(f'Could not load YAML from {yamlpath}: {exc}') from None
try:
self.metadata: Metadata = dct['rosbag2_bagfile_information']
if (ver := self.metadata['version']) > 6:
raise ReaderError(f'Rosbag2 version {ver} not supported; please report issue.')
if (storageid := self.metadata['storage_identifier']) not in self.STORAGE_PLUGINS:
raise ReaderError(
f'Storage plugin {storageid!r} not supported; please report issue.',
)
self.paths = [path / Path(x).name for x in self.metadata['relative_file_paths']]
if missing := [x for x in self.paths if not x.exists()]:
raise ReaderError(f'Some database files are missing: {[str(x) for x in missing]!r}')
self.connections = [
Connection(
id=idx + 1,
topic=x['topic_metadata']['name'],
msgtype=x['topic_metadata']['type'],
msgdef='',
md5sum='',
msgcount=x['message_count'],
ext=ConnectionExtRosbag2(
serialization_format=x['topic_metadata']['serialization_format'],
offered_qos_profiles=x['topic_metadata'].get('offered_qos_profiles', ''),
),
owner=self,
) for idx, x in enumerate(self.metadata['topics_with_message_count'])
]
noncdr = {
fmt for x in self.connections if isinstance(x.ext, ConnectionExtRosbag2)
if (fmt := x.ext.serialization_format) != 'cdr'
}
if noncdr:
raise ReaderError(f'Serialization format {noncdr!r} is not supported.')
if self.compression_mode and (cfmt := self.compression_format) != 'zstd':
raise ReaderError(f'Compression format {cfmt!r} is not supported.')
self.files: list[FileInformation] = self.metadata.get('files', [])[:]
self.custom_data: dict[str, str] = self.metadata.get('custom_data', {})
self.tmpdir: Optional[TemporaryDirectory[str]] = None
self.storage: Optional[StorageProtocol] = None
except KeyError as exc:
raise ReaderError(f'A metadata key is missing {exc!r}.') from None
@property
def duration(self) -> int:
"""Duration in nanoseconds between earliest and latest messages."""
nsecs: int = self.metadata['duration']['nanoseconds']
return nsecs + 1 if self.message_count else 0
@property
def start_time(self) -> int:
"""Timestamp in nanoseconds of the earliest message."""
nsecs: int = self.metadata['starting_time']['nanoseconds_since_epoch']
return nsecs if self.message_count else 2**63 - 1
@property
def end_time(self) -> int:
"""Timestamp in nanoseconds after the latest message."""
return self.start_time + self.duration if self.message_count else 0
@property
def message_count(self) -> int:
"""Total message count."""
return self.metadata['message_count']
@property
def compression_format(self) -> Optional[str]:
"""Compression format."""
return self.metadata.get('compression_format', None) or None
@property
def compression_mode(self) -> Optional[str]:
"""Compression mode."""
mode = self.metadata.get('compression_mode', '').lower()
return mode if mode != 'none' else None
@property
def topics(self) -> dict[str, TopicInfo]:
"""Topic information."""
return {x.topic: TopicInfo(x.msgtype, x.msgdef, x.msgcount, [x]) for x in self.connections}
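The `duration`, `start_time`, and `end_time` properties above are pure arithmetic over the parsed metadata. A sketch of that arithmetic applied to a hand-written metadata fragment (all values here are made up for illustration):

```python
# Minimal stand-in for the parsed rosbag2_bagfile_information dict.
metadata = {
    'message_count': 42,
    'duration': {'nanoseconds': 999},
    'starting_time': {'nanoseconds_since_epoch': 1_000_000},
}

message_count = metadata['message_count']
# duration is exclusive of the last message, hence the +1 when non-empty.
duration = metadata['duration']['nanoseconds'] + 1 if message_count else 0
# start_time degenerates to int64 max for empty bags.
start_time = (
    metadata['starting_time']['nanoseconds_since_epoch']
    if message_count else 2**63 - 1
)
end_time = start_time + duration if message_count else 0

assert duration == 1000
assert end_time == start_time + 1000
```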
def open(self) -> None:
"""Open rosbag2."""
storage_paths = []
if self.compression_mode == 'file':
self.tmpdir = TemporaryDirectory() # pylint: disable=consider-using-with
tmpdir = self.tmpdir.name
decomp = zstandard.ZstdDecompressor()
for path in self.paths:
storage_file = Path(tmpdir, path.stem)
with path.open('rb') as infile, storage_file.open('wb') as outfile:
decomp.copy_stream(infile, outfile)
storage_paths.append(storage_file)
else:
storage_paths = self.paths[:]
self.storage = self.STORAGE_PLUGINS[self.metadata['storage_identifier']](
storage_paths,
self.connections,
)
self.storage.open()
definitions = self.storage.get_definitions()
for idx, conn in enumerate(self.connections):
if desc := definitions.get(conn.msgtype):
self.connections[idx] = Connection(
id=conn.id,
topic=conn.topic,
msgtype=conn.msgtype,
msgdef=desc[1],
md5sum=desc[0],
msgcount=conn.msgcount,
ext=conn.ext,
owner=conn.owner,
)
def close(self) -> None:
"""Close rosbag2."""
assert self.storage
self.storage.close()
self.storage = None
if self.tmpdir:
self.tmpdir.cleanup()
self.tmpdir = None
def messages(
self,
connections: Iterable[Connection] = (),
start: Optional[int] = None,
stop: Optional[int] = None,
) -> Generator[tuple[Connection, int, bytes], None, None]:
"""Read messages from bag.
Args:
connections: Iterable with connections to filter for. An empty
iterable disables filtering on connections.
start: Yield only messages at or after this timestamp (ns).
stop: Yield only messages before this timestamp (ns).
Yields:
Tuples of connection, timestamp (ns), and rawdata.
Raises:
ReaderError: If reader was not opened.
"""
if not self.storage:
raise ReaderError('Rosbag is not open.')
if self.compression_mode == 'message':
decomp = zstandard.ZstdDecompressor().decompress
for connection, timestamp, data in self.storage.messages(connections, start, stop):
yield connection, timestamp, decomp(data)
else:
yield from self.storage.messages(connections, start, stop)
def __enter__(self) -> Reader:
"""Open rosbag2 when entering contextmanager."""
self.open()
return self
def __exit__(
self,
exc_type: Optional[Type[BaseException]],
exc_val: Optional[BaseException],
exc_tb: Optional[TracebackType],
) -> Literal[False]:
"""Close rosbag2 when exiting contextmanager."""
self.close()
return False


@ -0,0 +1,571 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Mcap storage."""
from __future__ import annotations
import heapq
from io import BytesIO
from struct import iter_unpack, unpack_from
from typing import TYPE_CHECKING, NamedTuple
import zstandard
from lz4.frame import decompress as lz4_decompress
from .errors import ReaderError
if TYPE_CHECKING:
from pathlib import Path
from typing import BinaryIO, Callable, Generator, Iterable, Optional
from rosbags.interfaces import Connection
class Schema(NamedTuple):
"""Schema."""
id: int
name: str
encoding: str
data: str
class Channel(NamedTuple):
"""Channel."""
id: int
schema: str
topic: str
message_encoding: str
metadata: bytes # dict[str, str]
class Chunk(NamedTuple):
"""Chunk."""
start_time: int
end_time: int
size: int
crc: int
compression: str
records: bytes
class ChunkInfo(NamedTuple):
"""Chunk."""
message_start_time: int
message_end_time: int
chunk_start_offset: int
chunk_length: int
message_index_offsets: dict[int, int]
message_index_length: int
compression: str
compressed_size: int
uncompressed_size: int
channel_count: dict[int, int]
class Statistics(NamedTuple):
"""Statistics."""
message_count: int
schema_count: int
channel_count: int
attachement_count: int
metadata_count: int
chunk_count: int
start_time: int
end_time: int
channel_message_counts: bytes
class Msg(NamedTuple):
"""Message wrapper."""
timestamp: int
offset: int
connection: Optional[Connection]
data: Optional[bytes]
def read_sized(bio: BinaryIO) -> bytes:
"""Read one record."""
return bio.read(unpack_from('<Q', bio.read(8))[0])
def skip_sized(bio: BinaryIO) -> None:
"""Read one record."""
bio.seek(unpack_from('<Q', bio.read(8))[0], 1)
def read_bytes(bio: BinaryIO) -> bytes:
"""Read string."""
return bio.read(unpack_from('<I', bio.read(4))[0])
def read_string(bio: BinaryIO) -> str:
"""Read string."""
return bio.read(unpack_from('<I', bio.read(4))[0]).decode()
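These helpers implement MCAP's length-prefixed framing: records carry a uint64 body length, while embedded strings and byte blobs carry a uint32 length. A self-contained round trip over the same two helpers:

```python
from io import BytesIO
from struct import pack, unpack_from

def read_sized(bio) -> bytes:
    """Read a uint64-length-prefixed record body."""
    return bio.read(unpack_from('<Q', bio.read(8))[0])

def read_string(bio) -> str:
    """Read a uint32-length-prefixed UTF-8 string."""
    return bio.read(unpack_from('<I', bio.read(4))[0]).decode()

# Build a tiny record by hand and read it back: the record body itself
# contains one length-prefixed string.
body = pack('<I', 4) + b'ros2'
bio = BytesIO(pack('<Q', len(body)) + body)
assert read_string(BytesIO(read_sized(bio))) == 'ros2'
```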
DECOMPRESSORS: dict[str, Callable[[bytes, int], bytes]] = {
'': lambda x, _: x,
'lz4': lambda x, _: lz4_decompress(x), # type: ignore
'zstd': zstandard.ZstdDecompressor().decompress,
}
def msgsrc(
chunk: ChunkInfo,
channel_map: dict[int, Connection],
start: int,
stop: int,
bio: BinaryIO,
) -> Generator[Msg, None, None]:
"""Yield messages from chunk in time order."""
yield Msg(chunk.message_start_time, 0, None, None)
bio.seek(chunk.chunk_start_offset + 9 + 40 + len(chunk.compression))
compressed_data = bio.read(chunk.compressed_size)
subio = BytesIO(DECOMPRESSORS[chunk.compression](compressed_data, chunk.uncompressed_size))
messages = []
while (offset := subio.tell()) < chunk.uncompressed_size:
op_ = ord(subio.read(1))
if op_ == 0x05:
recio = BytesIO(read_sized(subio))
channel_id, _, log_time, _ = unpack_from(
'<HIQQ',
recio.read(22),
)
if start <= log_time < stop and channel_id in channel_map:
messages.append(
Msg(
log_time,
chunk.chunk_start_offset + offset,
channel_map[channel_id],
recio.read(),
),
)
else:
skip_sized(subio)
yield from sorted(messages, key=lambda x: x.timestamp)
class MCAPFile:
"""Mcap format reader."""
# pylint: disable=too-many-instance-attributes
def __init__(self, path: Path):
"""Initialize."""
self.path = path
self.bio: Optional[BinaryIO] = None
self.data_start = 0
self.data_end = 0
self.schemas: dict[int, Schema] = {}
self.channels: dict[int, Channel] = {}
self.chunks: list[ChunkInfo] = []
self.statistics: Optional[Statistics] = None
def open(self) -> None:
"""Open MCAP."""
try:
self.bio = self.path.open('rb')
except OSError as err:
raise ReaderError(f'Could not open file {str(self.path)!r}: {err.strerror}.') from err
magic = self.bio.read(8)
if not magic:
raise ReaderError(f'File {str(self.path)!r} seems to be empty.')
if magic != b'\x89MCAP0\r\n':
raise ReaderError('File magic is invalid.')
op_ = ord(self.bio.read(1))
if op_ != 0x01:
raise ReaderError('Unexpected record.')
recio = BytesIO(read_sized(self.bio))
profile = read_string(recio)
if profile != 'ros2':
raise ReaderError('Profile is not ros2.')
self.data_start = self.bio.tell()
self.bio.seek(-37, 2)
footer_start = self.bio.tell()
data = self.bio.read()
magic = data[-8:]
if magic != b'\x89MCAP0\r\n':
raise ReaderError('File end magic is invalid.')
assert len(data) == 37
assert data[0:9] == b'\x02\x14\x00\x00\x00\x00\x00\x00\x00', data[0:9]
summary_start, = unpack_from('<Q', data, 9)
if summary_start:
self.data_end = summary_start
self.read_index()
else:
self.data_end = footer_start
def read_index(self) -> None:
"""Read index from file."""
bio = self.bio
assert bio
schemas = self.schemas
channels = self.channels
chunks = self.chunks
bio.seek(self.data_end)
while True:
op_ = ord(bio.read(1))
if op_ in (0x02, 0x0e):
break
if op_ == 0x03:
bio.seek(8, 1)
key, = unpack_from('<H', bio.read(2))
schemas[key] = Schema(
key,
read_string(bio),
read_string(bio),
read_string(bio),
)
elif op_ == 0x04:
bio.seek(8, 1)
key, = unpack_from('<H', bio.read(2))
schema_name = schemas[unpack_from('<H', bio.read(2))[0]].name
channels[key] = Channel(
key,
schema_name,
read_string(bio),
read_string(bio),
read_bytes(bio),
)
elif op_ == 0x08:
bio.seek(8, 1)
chunk = ChunkInfo( # type: ignore
*unpack_from('<QQQQ', bio.read(32), 0),
{
x[0]: x[1] for x in
iter_unpack('<HQ', bio.read(unpack_from('<I', bio.read(4))[0]))
},
*unpack_from('<Q', bio.read(8), 0),
read_string(bio),
*unpack_from('<QQ', bio.read(16), 0),
{},
)
offset_channel = sorted((v, k) for k, v in chunk.message_index_offsets.items())
offsets = [
*[x[0] for x in offset_channel],
chunk.chunk_start_offset + chunk.chunk_length + chunk.message_index_length,
]
chunk.channel_count.update(
{
x[1]: count // 16
for x, y, z in zip(offset_channel, offsets[1:], offsets)
if (count := y - z - 15)
},
)
chunks.append(chunk)
elif op_ == 0x0a:
skip_sized(bio)
elif op_ == 0x0b:
bio.seek(8, 1)
self.statistics = Statistics(
*unpack_from(
'<QHIIIIQQ',
bio.read(42),
0,
),
read_bytes(bio), # type: ignore
)
elif op_ == 0x0d:
skip_sized(bio)
else:
skip_sized(bio)
def close(self) -> None:
"""Close MCAP."""
assert self.bio
self.bio.close()
self.bio = None
def meta_scan(self) -> None:
"""Generate metadata by scanning through file."""
assert self.bio
bio = self.bio
bio_size = self.data_end
bio.seek(self.data_start)
schemas = self.schemas
channels = self.channels
while bio.tell() < bio_size:
op_ = ord(bio.read(1))
if op_ == 0x03:
bio.seek(8, 1)
key, = unpack_from('<H', bio.read(2))
schemas[key] = Schema(
key,
read_string(bio),
read_string(bio),
read_string(bio),
)
elif op_ == 0x04:
bio.seek(8, 1)
key, = unpack_from('<H', bio.read(2))
schema_name = schemas[unpack_from('<H', bio.read(2))[0]].name
channels[key] = Channel(
key,
schema_name,
read_string(bio),
read_string(bio),
read_bytes(bio),
)
elif op_ == 0x06:
bio.seek(8, 1)
_, _, uncompressed_size, _ = unpack_from('<QQQI', bio.read(28))
compression = read_string(bio)
compressed_size, = unpack_from('<Q', bio.read(8))
bio = BytesIO(
DECOMPRESSORS[compression](bio.read(compressed_size), uncompressed_size),
)
bio_size = uncompressed_size
else:
skip_sized(bio)
if bio.tell() == bio_size and bio != self.bio:
bio = self.bio
bio_size = self.data_end
def get_schema_definitions(self) -> dict[str, tuple[str, str]]:
"""Get schema definition."""
if not self.schemas:
self.meta_scan()
return {schema.name: (schema.encoding[4:], schema.data) for schema in self.schemas.values()}
def messages_scan(
self,
connections: Iterable[Connection],
start: Optional[int] = None,
stop: Optional[int] = None,
) -> Generator[tuple[Connection, int, bytes], None, None]:
"""Read messages by scanning whole bag."""
# pylint: disable=too-many-locals
assert self.bio
bio = self.bio
bio_size = self.data_end
bio.seek(self.data_start)
schemas = self.schemas.copy()
channels = self.channels.copy()
if channels:
read_meta = False
channel_map = {
cid: conn for conn in connections if (
cid := next(
(
cid for cid, x in self.channels.items()
if x.schema == conn.msgtype and x.topic == conn.topic
),
None,
)
)
}
else:
read_meta = True
channel_map = {}
if start is None:
start = 0
if stop is None:
stop = 2**63 - 1
while bio.tell() < bio_size:
op_ = ord(bio.read(1))
if op_ == 0x03 and read_meta:
bio.seek(8, 1)
key, = unpack_from('<H', bio.read(2))
schemas[key] = Schema(
key,
read_string(bio),
read_string(bio),
read_string(bio),
)
elif op_ == 0x04 and read_meta:
bio.seek(8, 1)
key, = unpack_from('<H', bio.read(2))
schema_name = schemas[unpack_from('<H', bio.read(2))[0]].name
channels[key] = Channel(
key,
schema_name,
read_string(bio),
read_string(bio),
read_bytes(bio),
)
conn = next(
(
x for x in connections
if x.topic == channels[key].topic and x.msgtype == schema_name
),
None,
)
if conn:
channel_map[key] = conn
elif op_ == 0x05:
size, channel_id, _, timestamp, _ = unpack_from('<QHIQQ', bio.read(30))
data = bio.read(size - 22)
if start <= timestamp < stop and channel_id in channel_map:
yield channel_map[channel_id], timestamp, data
elif op_ == 0x06:
size, = unpack_from('<Q', bio.read(8))
start_time, end_time, uncompressed_size, _ = unpack_from('<QQQI', bio.read(28))
if read_meta or (start < end_time and start_time < stop):
compression = read_string(bio)
compressed_size, = unpack_from('<Q', bio.read(8))
bio = BytesIO(
DECOMPRESSORS[compression](bio.read(compressed_size), uncompressed_size),
)
bio_size = uncompressed_size
else:
bio.seek(size - 28, 1)
else:
skip_sized(bio)
if bio.tell() == bio_size and bio != self.bio:
bio = self.bio
bio_size = self.data_end
def messages(
self,
connections: Iterable[Connection],
start: Optional[int] = None,
stop: Optional[int] = None,
) -> Generator[tuple[Connection, int, bytes], None, None]:
"""Read messages from bag.
Args:
connections: Iterable with connections to filter for.
start: Yield only messages at or after this timestamp (ns).
stop: Yield only messages before this timestamp (ns).
Yields:
Tuples of connection, timestamp (ns), and rawdata.
"""
assert self.bio
if not self.chunks:
yield from self.messages_scan(connections, start, stop)
return
channel_map = {
cid: conn for conn in connections if (
cid := next(
(
cid for cid, x in self.channels.items()
if x.schema == conn.msgtype and x.topic == conn.topic
),
None,
)
)
}
chunks = [
msgsrc(
x,
channel_map,
start or x.message_start_time,
stop or x.message_end_time + 1,
self.bio,
)
for x in self.chunks
if x.message_start_time != 0 and (start is None or start < x.message_end_time) and
(stop is None or x.message_start_time < stop) and
(any(x.channel_count.get(cid, 0) for cid in channel_map))
]
for timestamp, offset, connection, data in heapq.merge(*chunks):
if not offset:
continue
assert connection
assert data
yield connection, timestamp, data
class ReaderMcap:
"""Mcap storage reader."""
def __init__(
self,
paths: Iterable[Path],
connections: Iterable[Connection],
):
"""Set up storage reader.
Args:
paths: Paths of storage files.
connections: List of connections.
"""
self.paths = paths
self.readers: list[MCAPFile] = []
self.connections = connections
def open(self) -> None:
"""Open rosbag2."""
self.readers = [MCAPFile(x) for x in self.paths]
for reader in self.readers:
reader.open()
def close(self) -> None:
"""Close rosbag2."""
assert self.readers
for reader in self.readers:
reader.close()
self.readers = []
def get_definitions(self) -> dict[str, tuple[str, str]]:
"""Get message definitions."""
res = {}
for reader in self.readers:
res.update(reader.get_schema_definitions())
return res
def messages(
self,
connections: Iterable[Connection] = (),
start: Optional[int] = None,
stop: Optional[int] = None,
) -> Generator[tuple[Connection, int, bytes], None, None]:
"""Read messages from bag.
Args:
connections: Iterable with connections to filter for. An empty
iterable disables filtering on connections.
start: Yield only messages at or after this timestamp (ns).
stop: Yield only messages before this timestamp (ns).
Yields:
tuples of connection, timestamp (ns), and rawdata.
"""
connections = list(connections) or list(self.connections)
for reader in self.readers:
yield from reader.messages(connections, start, stop)
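The chunk generators built in `messages()` above are interleaved with `heapq.merge`, which yields messages in global timestamp order as long as each per-chunk stream is itself sorted. A minimal stdlib-only sketch of that merge pattern (the chunk contents here are invented for illustration):

```python
import heapq

# Each "chunk" yields (timestamp, data) tuples in ascending timestamp order,
# mirroring the per-chunk generators that msgsrc() produces.
chunk_a = iter([(1, b'a1'), (4, b'a4'), (9, b'a9')])
chunk_b = iter([(2, b'b2'), (3, b'b3'), (8, b'b8')])

# heapq.merge lazily interleaves the sorted streams into one sorted stream,
# without materializing all chunks in memory at once.
merged = list(heapq.merge(chunk_a, chunk_b))
print([t for t, _ in merged])  # → [1, 2, 3, 4, 8, 9]
```

Tuples compare element-wise, so sorting on the leading timestamp falls out of the default ordering, which is why the real reader puts the timestamp first in its yield tuples.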

View File

@ -0,0 +1,119 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Sqlite3 storage."""
from __future__ import annotations
import sqlite3
from typing import TYPE_CHECKING
from .errors import ReaderError
if TYPE_CHECKING:
from pathlib import Path
from typing import Any, Generator, Iterable, Optional
from rosbags.interfaces import Connection
class ReaderSqlite3:
"""Sqlite3 storage reader."""
def __init__(
self,
paths: Iterable[Path],
connections: Iterable[Connection],
):
"""Set up storage reader.
Args:
paths: Paths of storage files.
connections: List of connections.
"""
self.opened = False
self.paths = paths
self.connections = connections
def open(self) -> None:
"""Open rosbag2."""
self.opened = True
def close(self) -> None:
"""Close rosbag2."""
assert self.opened
self.opened = False
def get_definitions(self) -> dict[str, tuple[str, str]]:
"""Get message definitions."""
return {}
def messages( # pylint: disable=too-many-locals
self,
connections: Iterable[Connection] = (),
start: Optional[int] = None,
stop: Optional[int] = None,
) -> Generator[tuple[Connection, int, bytes], None, None]:
"""Read messages from bag.
Args:
connections: Iterable with connections to filter for. An empty
iterable disables filtering on connections.
start: Yield only messages at or after this timestamp (ns).
stop: Yield only messages before this timestamp (ns).
Yields:
tuples of connection, timestamp (ns), and rawdata.
Raises:
ReaderError: Database file cannot be opened or is missing tables.
"""
query = [
'SELECT topics.id,messages.timestamp,messages.data',
'FROM messages JOIN topics ON messages.topic_id=topics.id',
]
args: list[Any] = []
clause = 'WHERE'
if connections:
topics = {x.topic for x in connections}
query.append(f'{clause} topics.name IN ({",".join("?" for _ in topics)})')
args += topics
clause = 'AND'
if start is not None:
query.append(f'{clause} messages.timestamp >= ?')
args.append(start)
clause = 'AND'
if stop is not None:
query.append(f'{clause} messages.timestamp < ?')
args.append(stop)
clause = 'AND'
query.append('ORDER BY timestamp')
querystr = ' '.join(query)
for path in self.paths:
conn = sqlite3.connect(f'file:{path}?immutable=1', uri=True)
conn.row_factory = lambda _, x: x
cur = conn.cursor()
cur.execute(
'SELECT count(*) FROM sqlite_master '
'WHERE type="table" AND name IN ("messages", "topics")',
)
if cur.fetchone()[0] != 2:
raise ReaderError(f'Cannot open database {path} or database missing tables.')
cur.execute('SELECT name,id FROM topics')
connmap: dict[int, Connection] = {
row[1]: next((x for x in self.connections if x.topic == row[0]),
None) # type: ignore
for row in cur
}
cur.execute(querystr, args)
for cid, timestamp, data in cur:
yield connmap[cid], timestamp, data
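The query assembly in `messages()` above grows an optional WHERE clause one condition at a time, switching the joining keyword from `WHERE` to `AND` after the first condition so the SQL stays valid for any filter combination. A stdlib-only sketch of the same pattern against an in-memory database (table layout is simplified and the rows are invented):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.executescript(
    'CREATE TABLE topics(id INTEGER PRIMARY KEY, name TEXT);'
    'CREATE TABLE messages(topic_id INTEGER, timestamp INTEGER, data BLOB);',
)
conn.execute('INSERT INTO topics VALUES(?, ?)', (1, '/imu'))
conn.executemany(
    'INSERT INTO messages VALUES(?, ?, ?)',
    [(1, 10, b'x'), (1, 20, b'y'), (1, 30, b'z')],
)

# Build the query like the reader does: append each optional clause,
# then flip the keyword so later clauses join with AND.
query = [
    'SELECT messages.timestamp',
    'FROM messages JOIN topics ON messages.topic_id=topics.id',
]
args = []
clause = 'WHERE'
topics = {'/imu'}
query.append(f'{clause} topics.name IN ({",".join("?" for _ in topics)})')
args += topics
clause = 'AND'
start = 15
query.append(f'{clause} messages.timestamp >= ?')
args.append(start)
query.append('ORDER BY timestamp')

rows = [x[0] for x in conn.execute(' '.join(query), args)]
```

With both filters active this runs `... WHERE topics.name IN (?) AND messages.timestamp >= ? ORDER BY timestamp`, yielding the timestamps 20 and 30.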

View File

@ -0,0 +1,308 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Rosbag2 writer."""
from __future__ import annotations
import sqlite3
from enum import IntEnum, auto
from pathlib import Path
from typing import TYPE_CHECKING
import zstandard
from ruamel.yaml import YAML
from rosbags.interfaces import Connection, ConnectionExtRosbag2
if TYPE_CHECKING:
from types import TracebackType
from typing import Any, Literal, Optional, Type, Union
from .metadata import Metadata
class WriterError(Exception):
"""Writer Error."""
class Writer: # pylint: disable=too-many-instance-attributes
"""Rosbag2 writer.
This class implements writing of rosbag2 files in version 6. It should be
used as a contextmanager.
"""
SQLITE_SCHEMA = """
CREATE TABLE topics(
id INTEGER PRIMARY KEY,
name TEXT NOT NULL,
type TEXT NOT NULL,
serialization_format TEXT NOT NULL,
offered_qos_profiles TEXT NOT NULL
);
CREATE TABLE messages(
id INTEGER PRIMARY KEY,
topic_id INTEGER NOT NULL,
timestamp INTEGER NOT NULL,
data BLOB NOT NULL
);
CREATE INDEX timestamp_idx ON messages (timestamp ASC);
"""
class CompressionMode(IntEnum):
"""Compession modes."""
NONE = auto()
FILE = auto()
MESSAGE = auto()
class CompressionFormat(IntEnum):
"""Compession formats."""
ZSTD = auto()
def __init__(self, path: Union[Path, str]):
"""Initialize writer.
Args:
path: Filesystem path to bag.
Raises:
WriterError: Target path exists already, Writer can only create new rosbags.
"""
path = Path(path)
self.path = path
if path.exists():
raise WriterError(f'{path} exists already, not overwriting.')
self.metapath = path / 'metadata.yaml'
self.dbpath = path / f'{path.name}.db3'
self.compression_mode = ''
self.compression_format = ''
self.compressor: Optional[zstandard.ZstdCompressor] = None
self.connections: list[Connection] = []
self.counts: dict[int, int] = {}
self.conn: Optional[sqlite3.Connection] = None
self.cursor: Optional[sqlite3.Cursor] = None
self.custom_data: dict[str, str] = {}
def set_compression(self, mode: CompressionMode, fmt: CompressionFormat) -> None:
"""Enable compression on bag.
This function has to be called before opening.
Args:
mode: Compression mode to use, either 'file' or 'message'.
fmt: Compressor to use, currently only 'zstd'.
Raises:
WriterError: Bag already open.
"""
if self.conn:
raise WriterError(f'Cannot set compression, bag {self.path} already open.')
if mode == self.CompressionMode.NONE:
return
self.compression_mode = mode.name.lower()
self.compression_format = fmt.name.lower()
self.compressor = zstandard.ZstdCompressor()
def set_custom_data(self, key: str, value: str) -> None:
"""Set key value pair in custom_data.
Args:
key: Key to set.
value: Value to set.
Raises:
WriterError: If value has incorrect type.
"""
if not isinstance(value, str):
raise WriterError(f'Cannot set non-string value {value!r} in custom_data.')
self.custom_data[key] = value
def open(self) -> None:
"""Open rosbag2 for writing.
Create base directory and open database connection.
"""
try:
self.path.mkdir(mode=0o755, parents=True)
except FileExistsError:
raise WriterError(f'{self.path} exists already, not overwriting.') from None
self.conn = sqlite3.connect(f'file:{self.dbpath}', uri=True)
self.conn.executescript(self.SQLITE_SCHEMA)
self.cursor = self.conn.cursor()
def add_connection(
self,
topic: str,
msgtype: str,
serialization_format: str = 'cdr',
offered_qos_profiles: str = '',
**_kw: Any, # noqa: ANN401
) -> Connection:
"""Add a connection.
This function can only be called after opening a bag.
Args:
topic: Topic name.
msgtype: Message type.
serialization_format: Serialization format.
offered_qos_profiles: QOS Profile.
_kw: Ignored to allow consuming dicts from connection objects.
Returns:
Connection object.
Raises:
WriterError: Bag not open or topic previously registered.
"""
if not self.cursor:
raise WriterError('Bag was not opened.')
connection = Connection(
id=len(self.connections) + 1,
topic=topic,
msgtype=msgtype,
msgdef='',
md5sum='',
msgcount=0,
ext=ConnectionExtRosbag2(
serialization_format=serialization_format,
offered_qos_profiles=offered_qos_profiles,
),
owner=self,
)
for conn in self.connections:
if (
conn.topic == connection.topic and conn.msgtype == connection.msgtype and
conn.ext == connection.ext
):
raise WriterError(f'Connection can only be added once: {connection!r}.')
self.connections.append(connection)
self.counts[connection.id] = 0
meta = (connection.id, topic, msgtype, serialization_format, offered_qos_profiles)
self.cursor.execute('INSERT INTO topics VALUES(?, ?, ?, ?, ?)', meta)
return connection
def write(self, connection: Connection, timestamp: int, data: bytes) -> None:
"""Write message to rosbag2.
Args:
connection: Connection to write message to.
timestamp: Message timestamp (ns).
data: Serialized message data.
Raises:
WriterError: Bag not open or topic not registered.
"""
if not self.cursor:
raise WriterError('Bag was not opened.')
if connection not in self.connections:
raise WriterError(f'Tried to write to unknown connection {connection!r}.')
if self.compression_mode == 'message':
assert self.compressor
data = self.compressor.compress(data)
self.cursor.execute(
'INSERT INTO messages (topic_id, timestamp, data) VALUES(?, ?, ?)',
(connection.id, timestamp, data),
)
self.counts[connection.id] += 1
def close(self) -> None:
"""Close rosbag2 after writing.
Closes open database transactions and writes metadata.yaml.
"""
assert self.cursor
assert self.conn
self.cursor.close()
self.cursor = None
duration, start, count = self.conn.execute(
'SELECT max(timestamp) - min(timestamp), min(timestamp), count(*) FROM messages',
).fetchone()
self.conn.commit()
self.conn.execute('PRAGMA optimize')
self.conn.close()
if self.compression_mode == 'file':
assert self.compressor
src = self.dbpath
self.dbpath = src.with_suffix(f'.db3.{self.compression_format}')
with src.open('rb') as infile, self.dbpath.open('wb') as outfile:
self.compressor.copy_stream(infile, outfile)
src.unlink()
metadata: dict[str, Metadata] = {
'rosbag2_bagfile_information': {
'version': 6,
'storage_identifier': 'sqlite3',
'relative_file_paths': [self.dbpath.name],
'duration': {
'nanoseconds': duration,
},
'starting_time': {
'nanoseconds_since_epoch': start,
},
'message_count': count,
'topics_with_message_count': [
{
'topic_metadata': {
'name': x.topic,
'type': x.msgtype,
'serialization_format': x.ext.serialization_format,
'offered_qos_profiles': x.ext.offered_qos_profiles,
},
'message_count': self.counts[x.id],
} for x in self.connections if isinstance(x.ext, ConnectionExtRosbag2)
],
'compression_format': self.compression_format,
'compression_mode': self.compression_mode,
'files': [
{
'path': self.dbpath.name,
'starting_time': {
'nanoseconds_since_epoch': start,
},
'duration': {
'nanoseconds': duration,
},
'message_count': count,
},
],
'custom_data': self.custom_data,
},
}
with self.metapath.open('w') as metafile:
yaml = YAML(typ='safe')
yaml.default_flow_style = False
yaml.dump(metadata, metafile)
def __enter__(self) -> Writer:
"""Open rosbag2 when entering contextmanager."""
self.open()
return self
def __exit__(
self,
exc_type: Optional[Type[BaseException]],
exc_val: Optional[BaseException],
exc_tb: Optional[TracebackType],
) -> Literal[False]:
"""Close rosbag2 when exiting contextmanager."""
self.close()
return False
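The `close()` path above derives duration, start time, and total message count with a single aggregate query before writing `metadata.yaml`. A stdlib-only sketch of that schema and aggregate, using an in-memory database and invented rows:

```python
import sqlite3

# Same schema the writer creates (Writer.SQLITE_SCHEMA above).
SQLITE_SCHEMA = """
CREATE TABLE topics(
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    type TEXT NOT NULL,
    serialization_format TEXT NOT NULL,
    offered_qos_profiles TEXT NOT NULL
);
CREATE TABLE messages(
    id INTEGER PRIMARY KEY,
    topic_id INTEGER NOT NULL,
    timestamp INTEGER NOT NULL,
    data BLOB NOT NULL
);
CREATE INDEX timestamp_idx ON messages (timestamp ASC);
"""

conn = sqlite3.connect(':memory:')
conn.executescript(SQLITE_SCHEMA)
conn.execute(
    'INSERT INTO topics VALUES(?, ?, ?, ?, ?)',
    (1, '/chatter', 'std_msgs/msg/String', 'cdr', ''),
)
conn.executemany(
    'INSERT INTO messages (topic_id, timestamp, data) VALUES(?, ?, ?)',
    [(1, 100, b'a'), (1, 250, b'b'), (1, 400, b'c')],
)

# One aggregate query supplies everything metadata.yaml needs.
duration, start, count = conn.execute(
    'SELECT max(timestamp) - min(timestamp), min(timestamp), count(*) FROM messages',
).fetchone()
```

For the three invented rows this yields a duration of 300 ns, a start of 100 ns, and a count of 3, which would feed the `duration`, `starting_time`, and `message_count` metadata fields.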

View File

@ -0,0 +1,29 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Rosbags message serialization and deserialization.
Serializers and deserializers convert between Python message objects and
the common rosbag serialization formats. Computationally cheap functions
convert directly between different serialization formats.
"""
from .messages import SerdeError
from .serdes import (
cdr_to_ros1,
deserialize_cdr,
deserialize_ros1,
ros1_to_cdr,
serialize_cdr,
serialize_ros1,
)
__all__ = [
'SerdeError',
'cdr_to_ros1',
'deserialize_cdr',
'deserialize_ros1',
'ros1_to_cdr',
'serialize_cdr',
'serialize_ros1',
]

View File

@ -0,0 +1,461 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Code generators for CDR.
Common Data Representation `CDR`_ is the serialization format used by most ROS2
middlewares.
.. _CDR: https://www.omg.org/cgi-bin/doc?formal/02-06-51
"""
from __future__ import annotations
import sys
from itertools import tee
from typing import TYPE_CHECKING, Iterator, cast
from .typing import Field
from .utils import SIZEMAP, Valtype, align, align_after, compile_lines
if TYPE_CHECKING:
from .typing import CDRDeser, CDRSer, CDRSerSize
def generate_getsize_cdr(fields: list[Field]) -> tuple[CDRSerSize, int]:
"""Generate cdr size calculation function.
Args:
fields: Fields of message.
Returns:
Size calculation function and static size.
"""
# pylint: disable=too-many-branches,too-many-locals,too-many-nested-blocks,too-many-statements
size = 0
is_stat = True
aligned = 8
iterators = tee([*fields, None])
icurr = cast('Iterator[Field]', iterators[0])
inext = iterators[1]
next(inext)
lines = [
'import sys',
'from rosbags.serde.messages import get_msgdef',
'def getsize_cdr(pos, message, typestore):',
]
for fcurr, fnext in zip(icurr, inext):
fieldname, desc = fcurr
if desc.valtype == Valtype.MESSAGE:
if desc.args.size_cdr:
lines.append(f' pos += {desc.args.size_cdr}')
size += desc.args.size_cdr
else:
lines.append(f' func = get_msgdef("{desc.args.name}", typestore).getsize_cdr')
lines.append(f' pos = func(pos, message.{fieldname}, typestore)')
is_stat = False
aligned = align_after(desc)
elif desc.valtype == Valtype.BASE:
if desc.args == 'string':
lines.append(f' pos += 4 + len(message.{fieldname}.encode()) + 1')
aligned = 1
is_stat = False
else:
lines.append(f' pos += {SIZEMAP[desc.args]}')
aligned = SIZEMAP[desc.args]
size += SIZEMAP[desc.args]
elif desc.valtype == Valtype.ARRAY:
subdesc, length = desc.args
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
lines.append(f' val = message.{fieldname}')
for idx in range(length):
lines.append(' pos = (pos + 4 - 1) & -4')
lines.append(f' pos += 4 + len(val[{idx}].encode()) + 1')
aligned = 1
is_stat = False
else:
lines.append(f' pos += {length * SIZEMAP[subdesc.args]}')
size += length * SIZEMAP[subdesc.args]
else:
assert subdesc.valtype == Valtype.MESSAGE
anext_before = align(subdesc)
anext_after = align_after(subdesc)
if subdesc.args.size_cdr:
for _ in range(length):
if anext_before > anext_after:
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
size = (size + anext_before - 1) & -anext_before
lines.append(f' pos += {subdesc.args.size_cdr}')
size += subdesc.args.size_cdr
else:
lines.append(
f' func = get_msgdef("{subdesc.args.name}", typestore).getsize_cdr',
)
lines.append(f' val = message.{fieldname}')
for idx in range(length):
if anext_before > anext_after:
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
lines.append(f' pos = func(pos, val[{idx}], typestore)')
is_stat = False
aligned = align_after(subdesc)
else:
assert desc.valtype == Valtype.SEQUENCE
lines.append(' pos += 4')
aligned = 4
subdesc = desc.args[0]
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
lines.append(f' for val in message.{fieldname}:')
lines.append(' pos = (pos + 4 - 1) & -4')
lines.append(' pos += 4 + len(val.encode()) + 1')
aligned = 1
else:
anext_before = align(subdesc)
if aligned < anext_before:
lines.append(f' if len(message.{fieldname}):')
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
aligned = anext_before
lines.append(f' pos += len(message.{fieldname}) * {SIZEMAP[subdesc.args]}')
else:
assert subdesc.valtype == Valtype.MESSAGE
anext_before = align(subdesc)
anext_after = align_after(subdesc)
lines.append(f' val = message.{fieldname}')
if subdesc.args.size_cdr:
if aligned < anext_before <= anext_after:
lines.append(' if len(val):')
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
lines.append(' for _ in val:')
if anext_before > anext_after:
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
lines.append(f' pos += {subdesc.args.size_cdr}')
else:
lines.append(
f' func = get_msgdef("{subdesc.args.name}", typestore).getsize_cdr',
)
if aligned < anext_before <= anext_after:
lines.append(' if len(val):')
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
lines.append(' for item in val:')
if anext_before > anext_after:
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
lines.append(' pos = func(pos, item, typestore)')
aligned = align_after(subdesc)
aligned = min([aligned, 4])
is_stat = False
if fnext and aligned < (anext_before := align(fnext.descriptor)):
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
aligned = anext_before
is_stat = False
lines.append(' return pos')
return compile_lines(lines).getsize_cdr, is_stat * size
def generate_serialize_cdr(fields: list[Field], endianess: str) -> CDRSer:
"""Generate cdr serialization function.
Args:
fields: Fields of message.
endianess: Endianness of rawdata.
Returns:
Serializer function.
"""
# pylint: disable=too-many-branches,too-many-locals,too-many-statements
aligned = 8
iterators = tee([*fields, None])
icurr = cast('Iterator[Field]', iterators[0])
inext = iterators[1]
next(inext)
lines = [
'import sys',
'import numpy',
'from rosbags.serde.messages import SerdeError, get_msgdef',
f'from rosbags.serde.primitives import pack_bool_{endianess}',
f'from rosbags.serde.primitives import pack_int8_{endianess}',
f'from rosbags.serde.primitives import pack_int16_{endianess}',
f'from rosbags.serde.primitives import pack_int32_{endianess}',
f'from rosbags.serde.primitives import pack_int64_{endianess}',
f'from rosbags.serde.primitives import pack_uint8_{endianess}',
f'from rosbags.serde.primitives import pack_uint16_{endianess}',
f'from rosbags.serde.primitives import pack_uint32_{endianess}',
f'from rosbags.serde.primitives import pack_uint64_{endianess}',
f'from rosbags.serde.primitives import pack_float32_{endianess}',
f'from rosbags.serde.primitives import pack_float64_{endianess}',
'def serialize_cdr(rawdata, pos, message, typestore):',
]
for fcurr, fnext in zip(icurr, inext):
fieldname, desc = fcurr
lines.append(f' val = message.{fieldname}')
if desc.valtype == Valtype.MESSAGE:
name = desc.args.name
lines.append(f' func = get_msgdef("{name}", typestore).serialize_cdr_{endianess}')
lines.append(' pos = func(rawdata, pos, val, typestore)')
aligned = align_after(desc)
elif desc.valtype == Valtype.BASE:
if desc.args == 'string':
lines.append(' bval = memoryview(val.encode())')
lines.append(' length = len(bval) + 1')
lines.append(f' pack_int32_{endianess}(rawdata, pos, length)')
lines.append(' pos += 4')
lines.append(' rawdata[pos:pos + length - 1] = bval')
lines.append(' pos += length')
aligned = 1
else:
lines.append(f' pack_{desc.args}_{endianess}(rawdata, pos, val)')
lines.append(f' pos += {SIZEMAP[desc.args]}')
aligned = SIZEMAP[desc.args]
elif desc.valtype == Valtype.ARRAY:
subdesc, length = desc.args
lines.append(f' if len(val) != {length}:')
lines.append(' raise SerdeError(\'Unexpected array length\')')
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
for idx in range(length):
lines.append(f' bval = memoryview(val[{idx}].encode())')
lines.append(' length = len(bval) + 1')
lines.append(' pos = (pos + 4 - 1) & -4')
lines.append(f' pack_int32_{endianess}(rawdata, pos, length)')
lines.append(' pos += 4')
lines.append(' rawdata[pos:pos + length - 1] = bval')
lines.append(' pos += length')
aligned = 1
else:
if (endianess == 'le') != (sys.byteorder == 'little'):
lines.append(' val = val.byteswap()')
size = length * SIZEMAP[subdesc.args]
lines.append(f' rawdata[pos:pos + {size}] = val.view(numpy.uint8)')
lines.append(f' pos += {size}')
else:
assert subdesc.valtype == Valtype.MESSAGE
anext_before = align(subdesc)
anext_after = align_after(subdesc)
name = subdesc.args.name
lines.append(f' func = get_msgdef("{name}", typestore).serialize_cdr_{endianess}')
for idx in range(length):
if anext_before > anext_after:
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
lines.append(f' pos = func(rawdata, pos, val[{idx}], typestore)')
aligned = align_after(subdesc)
else:
assert desc.valtype == Valtype.SEQUENCE
lines.append(f' pack_int32_{endianess}(rawdata, pos, len(val))')
lines.append(' pos += 4')
aligned = 4
subdesc = desc.args[0]
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
lines.append(' for item in val:')
lines.append(' bval = memoryview(item.encode())')
lines.append(' length = len(bval) + 1')
lines.append(' pos = (pos + 4 - 1) & -4')
lines.append(f' pack_int32_{endianess}(rawdata, pos, length)')
lines.append(' pos += 4')
lines.append(' rawdata[pos:pos + length - 1] = bval')
lines.append(' pos += length')
aligned = 1
else:
lines.append(f' size = len(val) * {SIZEMAP[subdesc.args]}')
if (endianess == 'le') != (sys.byteorder == 'little'):
lines.append(' val = val.byteswap()')
if aligned < (anext_before := align(subdesc)):
lines.append(' if size:')
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
lines.append(' rawdata[pos:pos + size] = val.view(numpy.uint8)')
lines.append(' pos += size')
aligned = anext_before
if subdesc.valtype == Valtype.MESSAGE:
anext_before = align(subdesc)
name = subdesc.args.name
lines.append(f' func = get_msgdef("{name}", typestore).serialize_cdr_{endianess}')
lines.append(' for item in val:')
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
lines.append(' pos = func(rawdata, pos, item, typestore)')
aligned = align_after(subdesc)
aligned = min([4, aligned])
if fnext and aligned < (anext_before := align(fnext.descriptor)):
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
aligned = anext_before
lines.append(' return pos')
return compile_lines(lines).serialize_cdr # type: ignore
def generate_deserialize_cdr(fields: list[Field], endianess: str) -> CDRDeser:
"""Generate cdr deserialization function.
Args:
fields: Fields of message.
endianess: Endianness of rawdata.
Returns:
Deserializer function.
"""
# pylint: disable=too-many-branches,too-many-locals,too-many-nested-blocks,too-many-statements
aligned = 8
iterators = tee([*fields, None])
icurr = cast('Iterator[Field]', iterators[0])
inext = iterators[1]
next(inext)
lines = [
'import sys',
'import numpy',
'from rosbags.serde.messages import SerdeError, get_msgdef',
f'from rosbags.serde.primitives import unpack_bool_{endianess}',
f'from rosbags.serde.primitives import unpack_int8_{endianess}',
f'from rosbags.serde.primitives import unpack_int16_{endianess}',
f'from rosbags.serde.primitives import unpack_int32_{endianess}',
f'from rosbags.serde.primitives import unpack_int64_{endianess}',
f'from rosbags.serde.primitives import unpack_uint8_{endianess}',
f'from rosbags.serde.primitives import unpack_uint16_{endianess}',
f'from rosbags.serde.primitives import unpack_uint32_{endianess}',
f'from rosbags.serde.primitives import unpack_uint64_{endianess}',
f'from rosbags.serde.primitives import unpack_float32_{endianess}',
f'from rosbags.serde.primitives import unpack_float64_{endianess}',
'def deserialize_cdr(rawdata, pos, cls, typestore):',
]
funcname = f'deserialize_cdr_{endianess}'
lines.append(' values = []')
for fcurr, fnext in zip(icurr, inext):
desc = fcurr[1]
if desc.valtype == Valtype.MESSAGE:
lines.append(f' msgdef = get_msgdef("{desc.args.name}", typestore)')
lines.append(f' obj, pos = msgdef.{funcname}(rawdata, pos, msgdef.cls, typestore)')
lines.append(' values.append(obj)')
aligned = align_after(desc)
elif desc.valtype == Valtype.BASE:
if desc.args == 'string':
lines.append(f' length = unpack_int32_{endianess}(rawdata, pos)[0]')
lines.append(' string = bytes(rawdata[pos + 4:pos + 4 + length - 1]).decode()')
lines.append(' values.append(string)')
lines.append(' pos += 4 + length')
aligned = 1
else:
lines.append(f' value = unpack_{desc.args}_{endianess}(rawdata, pos)[0]')
lines.append(' values.append(value)')
lines.append(f' pos += {SIZEMAP[desc.args]}')
aligned = SIZEMAP[desc.args]
elif desc.valtype == Valtype.ARRAY:
subdesc, length = desc.args
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
lines.append(' value = []')
for idx in range(length):
if idx:
lines.append(' pos = (pos + 4 - 1) & -4')
lines.append(f' length = unpack_int32_{endianess}(rawdata, pos)[0]')
lines.append(
' value.append(bytes(rawdata[pos + 4:pos + 4 + length - 1]).decode())',
)
lines.append(' pos += 4 + length')
lines.append(' values.append(value)')
aligned = 1
else:
size = length * SIZEMAP[subdesc.args]
lines.append(
f' val = numpy.frombuffer(rawdata, '
f'dtype=numpy.{subdesc.args}, count={length}, offset=pos)',
)
if (endianess == 'le') != (sys.byteorder == 'little'):
lines.append(' val = val.byteswap()')
lines.append(' values.append(val)')
lines.append(f' pos += {size}')
else:
assert subdesc.valtype == Valtype.MESSAGE
anext_before = align(subdesc)
anext_after = align_after(subdesc)
lines.append(f' msgdef = get_msgdef("{subdesc.args.name}", typestore)')
lines.append(' value = []')
for _ in range(length):
if anext_before > anext_after:
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
lines.append(
f' obj, pos = msgdef.{funcname}(rawdata, pos, msgdef.cls, typestore)',
)
lines.append(' value.append(obj)')
lines.append(' values.append(value)')
aligned = align_after(subdesc)
else:
assert desc.valtype == Valtype.SEQUENCE
lines.append(f' size = unpack_int32_{endianess}(rawdata, pos)[0]')
lines.append(' pos += 4')
aligned = 4
subdesc = desc.args[0]
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
lines.append(' value = []')
lines.append(' for _ in range(size):')
lines.append(' pos = (pos + 4 - 1) & -4')
lines.append(f' length = unpack_int32_{endianess}(rawdata, pos)[0]')
lines.append(
' value.append(bytes(rawdata[pos + 4:pos + 4 + length - 1])'
'.decode())',
)
lines.append(' pos += 4 + length')
lines.append(' values.append(value)')
aligned = 1
else:
lines.append(f' length = size * {SIZEMAP[subdesc.args]}')
if aligned < (anext_before := align(subdesc)):
lines.append(' if size:')
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
lines.append(
f' val = numpy.frombuffer(rawdata, '
f'dtype=numpy.{subdesc.args}, count=size, offset=pos)',
)
if (endianess == 'le') != (sys.byteorder == 'little'):
lines.append(' val = val.byteswap()')
lines.append(' values.append(val)')
lines.append(' pos += length')
aligned = anext_before
if subdesc.valtype == Valtype.MESSAGE:
anext_before = align(subdesc)
lines.append(f' msgdef = get_msgdef("{subdesc.args.name}", typestore)')
lines.append(' value = []')
lines.append(' for _ in range(size):')
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
lines.append(
f' obj, pos = msgdef.{funcname}(rawdata, pos, msgdef.cls, typestore)',
)
lines.append(' value.append(obj)')
lines.append(' values.append(value)')
aligned = align_after(subdesc)
aligned = min([4, aligned])
if fnext and aligned < (anext_before := align(fnext.descriptor)):
lines.append(f' pos = (pos + {anext_before} - 1) & -{anext_before}')
aligned = anext_before
lines.append(' return cls(*values), pos')
return compile_lines(lines).deserialize_cdr # type: ignore
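The generated code above rounds `pos` up to the next multiple of an alignment with the expression `(pos + n - 1) & -n`. This works for any power-of-two `n` because `-n` in two's complement is a mask that clears the low bits. A quick sketch of the trick in isolation:

```python
def align_up(pos: int, n: int) -> int:
    """Round pos up to the next multiple of n (n must be a power of two)."""
    return (pos + n - 1) & -n

# CDR aligns each primitive to its own size: an int32 after a single byte
# starts at offset 4, an int64 after 9 bytes starts at offset 16, and a
# value that is already aligned stays put.
print(align_up(1, 4), align_up(9, 8), align_up(8, 8))  # → 4 16 8
```

Inlining this arithmetic into the generated source, instead of calling a helper, keeps the hot (de)serialization path free of function-call overhead.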

View File

@ -0,0 +1,92 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Runtime message loader and cache."""
from __future__ import annotations
from typing import TYPE_CHECKING
from .cdr import generate_deserialize_cdr, generate_getsize_cdr, generate_serialize_cdr
from .ros1 import (
generate_cdr_to_ros1,
generate_deserialize_ros1,
generate_getsize_ros1,
generate_ros1_to_cdr,
generate_serialize_ros1,
)
from .typing import Descriptor, Field, Msgdef
from .utils import Valtype
if TYPE_CHECKING:
from rosbags.typesys.base import Fielddesc
from rosbags.typesys.register import Typestore
MSGDEFCACHE: dict[Typestore, dict[str, Msgdef]] = {}
class SerdeError(Exception):
"""Serialization and Deserialization Error."""
def get_msgdef(typename: str, typestore: Typestore) -> Msgdef:
"""Retrieve message definition for typename.
Message definitions are cached globally and generated as needed.
Args:
typename: Msgdef type name to load.
typestore: Type store.
Returns:
Message definition.
"""
if typestore not in MSGDEFCACHE:
MSGDEFCACHE[typestore] = {}
cache = MSGDEFCACHE[typestore]
if typename not in cache:
entries = typestore.FIELDDEFS[typename][1]
def fixup(entry: Fielddesc) -> Descriptor:
if entry[0] == Valtype.BASE:
assert isinstance(entry[1], str)
return Descriptor(Valtype.BASE, entry[1])
if entry[0] == Valtype.MESSAGE:
assert isinstance(entry[1], str)
return Descriptor(Valtype.MESSAGE, get_msgdef(entry[1], typestore))
if entry[0] == Valtype.ARRAY:
assert not isinstance(entry[1][0], str)
return Descriptor(Valtype.ARRAY, (fixup(entry[1][0]), entry[1][1]))
if entry[0] == Valtype.SEQUENCE:
assert not isinstance(entry[1][0], str)
return Descriptor(Valtype.SEQUENCE, (fixup(entry[1][0]), entry[1][1]))
raise SerdeError( # pragma: no cover
f'Unknown field type {entry[0]!r} encountered.',
)
fields = [Field(name, fixup(desc)) for name, desc in entries]
getsize_cdr, size_cdr = generate_getsize_cdr(fields)
getsize_ros1, size_ros1 = generate_getsize_ros1(fields, typename)
cache[typename] = Msgdef(
typename,
fields,
getattr(typestore, typename.replace('/', '__')),
size_cdr,
getsize_cdr,
generate_serialize_cdr(fields, 'le'),
generate_serialize_cdr(fields, 'be'),
generate_deserialize_cdr(fields, 'le'),
generate_deserialize_cdr(fields, 'be'),
size_ros1,
getsize_ros1,
generate_serialize_ros1(fields, typename),
generate_deserialize_ros1(fields, typename),
generate_ros1_to_cdr(fields, typename, False), # type: ignore
generate_ros1_to_cdr(fields, typename, True), # type: ignore
generate_cdr_to_ros1(fields, typename, False), # type: ignore
generate_cdr_to_ros1(fields, typename, True), # type: ignore
)
return cache[typename]
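`get_msgdef()` fills its per-typestore cache with functions that were emitted as source text and compiled at runtime (via `compile_lines` in the generators above). A minimal sketch of that runtime-codegen technique, independent of the rosbags helpers; the message layout here is hypothetical:

```python
from types import ModuleType


def compile_lines(lines):
    """Compile source lines into a throwaway module and return it."""
    module = ModuleType('compiled')
    exec('\n'.join(lines), module.__dict__)  # noqa: S102
    return module


# Emit a size function, in the spirit of generate_getsize_cdr, for a
# hypothetical message with one int32 field followed by one float64 field:
# 4 bytes, align to 8, then 8 bytes.
lines = [
    'def getsize(pos):',
    '    pos += 4',
    '    pos = (pos + 8 - 1) & -8',
    '    pos += 8',
    '    return pos',
]
getsize = compile_lines(lines).getsize
print(getsize(0))  # → 16
```

Generating a specialized flat function per message type trades one-time compilation cost for fast per-message calls, since all field offsets and branches are resolved at generation time.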

View File

@ -0,0 +1,55 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Serialization primitives.
These functions are used by generated code to serialize and deserialize
primitive values.
"""
from struct import Struct
pack_bool_le = Struct('?').pack_into
pack_int8_le = Struct('b').pack_into
pack_int16_le = Struct('<h').pack_into
pack_int32_le = Struct('<i').pack_into
pack_int64_le = Struct('<q').pack_into
pack_uint8_le = Struct('B').pack_into
pack_uint16_le = Struct('<H').pack_into
pack_uint32_le = Struct('<I').pack_into
pack_uint64_le = Struct('<Q').pack_into
pack_float32_le = Struct('<f').pack_into
pack_float64_le = Struct('<d').pack_into
unpack_bool_le = Struct('?').unpack_from
unpack_int8_le = Struct('b').unpack_from
unpack_int16_le = Struct('<h').unpack_from
unpack_int32_le = Struct('<i').unpack_from
unpack_int64_le = Struct('<q').unpack_from
unpack_uint8_le = Struct('B').unpack_from
unpack_uint16_le = Struct('<H').unpack_from
unpack_uint32_le = Struct('<I').unpack_from
unpack_uint64_le = Struct('<Q').unpack_from
unpack_float32_le = Struct('<f').unpack_from
unpack_float64_le = Struct('<d').unpack_from
pack_bool_be = Struct('?').pack_into
pack_int8_be = Struct('b').pack_into
pack_int16_be = Struct('>h').pack_into
pack_int32_be = Struct('>i').pack_into
pack_int64_be = Struct('>q').pack_into
pack_uint8_be = Struct('B').pack_into
pack_uint16_be = Struct('>H').pack_into
pack_uint32_be = Struct('>I').pack_into
pack_uint64_be = Struct('>Q').pack_into
pack_float32_be = Struct('>f').pack_into
pack_float64_be = Struct('>d').pack_into
unpack_bool_be = Struct('?').unpack_from
unpack_int8_be = Struct('b').unpack_from
unpack_int16_be = Struct('>h').unpack_from
unpack_int32_be = Struct('>i').unpack_from
unpack_int64_be = Struct('>q').unpack_from
unpack_uint8_be = Struct('B').unpack_from
unpack_uint16_be = Struct('>H').unpack_from
unpack_uint32_be = Struct('>I').unpack_from
unpack_uint64_be = Struct('>Q').unpack_from
unpack_float32_be = Struct('>f').unpack_from
unpack_float64_be = Struct('>d').unpack_from
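The aliases above are thin wrappers around `struct.Struct`: the generated serializers pack into a preallocated buffer at a running offset and unpack from it again. A minimal sketch, recreating two of the aliases locally:

```python
# Recreate two primitives locally and use them the way generated code does:
# write fixed-size values at explicit offsets into a preallocated buffer.
from struct import Struct

pack_int32_le = Struct('<i').pack_into
unpack_int32_le = Struct('<i').unpack_from

buf = bytearray(8)
pack_int32_le(buf, 0, 42)   # write int32 at offset 0
pack_int32_le(buf, 4, -1)   # write int32 at offset 4

assert unpack_int32_le(buf, 0)[0] == 42   # unpack_from returns a tuple
assert unpack_int32_le(buf, 4)[0] == -1
assert bytes(buf[:4]) == b'\x2a\x00\x00\x00'  # little-endian byte layout
```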

View File

View File

@@ -0,0 +1,680 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Code generators for ROS1.
`ROS1`_ uses its own serialization format. This module supports fast byte-level
conversion between ROS1 and CDR.
.. _ROS1: http://wiki.ros.org/ROS/Technical%20Overview
"""
from __future__ import annotations
import sys
from itertools import tee
from typing import TYPE_CHECKING, Iterator, cast
from .typing import Field
from .utils import SIZEMAP, Valtype, align, align_after, compile_lines
if TYPE_CHECKING:
from typing import Union
from .typing import Bitcvt, BitcvtSize, CDRDeser, CDRSer, CDRSerSize
def generate_ros1_to_cdr(
fields: list[Field],
typename: str,
copy: bool,
) -> Union[Bitcvt, BitcvtSize]:
"""Generate ROS1 to CDR conversion function.
Args:
fields: Fields of message.
typename: Message type name.
copy: Generate conversion or sizing function.
Returns:
ROS1 to CDR conversion function.
"""
# pylint: disable=too-many-branches,too-many-locals,too-many-nested-blocks,too-many-statements
aligned = 8
iterators = tee([*fields, None])
icurr = cast('Iterator[Field]', iterators[0])
inext = iterators[1]
next(inext)
funcname = 'ros1_to_cdr' if copy else 'getsize_ros1_to_cdr'
lines = [
'import sys',
'import numpy',
'from rosbags.serde.messages import SerdeError, get_msgdef',
'from rosbags.serde.primitives import pack_int32_le',
'from rosbags.serde.primitives import unpack_int32_le',
f'def {funcname}(input, ipos, output, opos, typestore):',
]
if typename == 'std_msgs/msg/Header':
lines.append(' ipos += 4')
for fcurr, fnext in zip(icurr, inext):
_, desc = fcurr
if desc.valtype == Valtype.MESSAGE:
lines.append(f' func = get_msgdef("{desc.args.name}", typestore).{funcname}')
lines.append(' ipos, opos = func(input, ipos, output, opos, typestore)')
aligned = align_after(desc)
elif desc.valtype == Valtype.BASE:
if desc.args == 'string':
lines.append(' length = unpack_int32_le(input, ipos)[0] + 1')
if copy:
lines.append(' pack_int32_le(output, opos, length)')
lines.append(' ipos += 4')
lines.append(' opos += 4')
if copy:
lines.append(' output[opos:opos + length - 1] = input[ipos:ipos + length - 1]')
lines.append(' ipos += length - 1')
lines.append(' opos += length')
aligned = 1
else:
size = SIZEMAP[desc.args]
if copy:
lines.append(f' output[opos:opos + {size}] = input[ipos:ipos + {size}]')
lines.append(f' ipos += {size}')
lines.append(f' opos += {size}')
aligned = size
elif desc.valtype == Valtype.ARRAY:
subdesc, length = desc.args
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
for _ in range(length):
lines.append(' opos = (opos + 4 - 1) & -4')
lines.append(' length = unpack_int32_le(input, ipos)[0] + 1')
if copy:
lines.append(' pack_int32_le(output, opos, length)')
lines.append(' ipos += 4')
lines.append(' opos += 4')
if copy:
lines.append(
' output[opos:opos + length - 1] = input[ipos:ipos + length - 1]',
)
lines.append(' ipos += length - 1')
lines.append(' opos += length')
aligned = 1
else:
size = length * SIZEMAP[subdesc.args]
if copy:
lines.append(f' output[opos:opos + {size}] = input[ipos:ipos + {size}]')
lines.append(f' ipos += {size}')
lines.append(f' opos += {size}')
aligned = SIZEMAP[subdesc.args]
if subdesc.valtype == Valtype.MESSAGE:
anext_before = align(subdesc)
anext_after = align_after(subdesc)
lines.append(f' func = get_msgdef("{subdesc.args.name}", typestore).{funcname}')
for _ in range(length):
if anext_before > anext_after:
lines.append(f' opos = (opos + {anext_before} - 1) & -{anext_before}')
lines.append(' ipos, opos = func(input, ipos, output, opos, typestore)')
aligned = anext_after
else:
assert desc.valtype == Valtype.SEQUENCE
lines.append(' size = unpack_int32_le(input, ipos)[0]')
if copy:
lines.append(' pack_int32_le(output, opos, size)')
lines.append(' ipos += 4')
lines.append(' opos += 4')
subdesc = desc.args[0]
aligned = 4
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
lines.append(' for _ in range(size):')
lines.append(' length = unpack_int32_le(input, ipos)[0] + 1')
lines.append(' opos = (opos + 4 - 1) & -4')
if copy:
lines.append(' pack_int32_le(output, opos, length)')
lines.append(' ipos += 4')
lines.append(' opos += 4')
if copy:
lines.append(
' output[opos:opos + length - 1] = input[ipos:ipos + length - 1]',
)
lines.append(' ipos += length - 1')
lines.append(' opos += length')
aligned = 1
else:
if aligned < (anext_before := align(subdesc)):
lines.append(' if size:')
lines.append(f' opos = (opos + {anext_before} - 1) & -{anext_before}')
lines.append(f' length = size * {SIZEMAP[subdesc.args]}')
if copy:
lines.append(' output[opos:opos + length] = input[ipos:ipos + length]')
lines.append(' ipos += length')
lines.append(' opos += length')
aligned = anext_before
else:
assert subdesc.valtype == Valtype.MESSAGE
anext_before = align(subdesc)
lines.append(f' func = get_msgdef("{subdesc.args.name}", typestore).{funcname}')
lines.append(' for _ in range(size):')
lines.append(f' opos = (opos + {anext_before} - 1) & -{anext_before}')
lines.append(' ipos, opos = func(input, ipos, output, opos, typestore)')
aligned = align_after(subdesc)
aligned = min([aligned, 4])
if fnext and aligned < (anext_before := align(fnext.descriptor)):
lines.append(f' opos = (opos + {anext_before} - 1) & -{anext_before}')
aligned = anext_before
lines.append(' return ipos, opos')
return getattr(compile_lines(lines), funcname) # type: ignore
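The generated code above repeatedly rounds an offset up to the CDR alignment of the next field with expressions of the form `(opos + n - 1) & -n`. A standalone sketch of that bit trick (valid for power-of-two alignments):

```python
def align_up(pos: int, alignment: int) -> int:
    # Round pos up to the next multiple of alignment (a power of two),
    # mirroring the `(opos + n - 1) & -n` expressions in the generated code.
    return (pos + alignment - 1) & -alignment

assert align_up(0, 4) == 0
assert align_up(1, 4) == 4
assert align_up(4, 4) == 4
assert align_up(5, 8) == 8
```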
def generate_cdr_to_ros1(
fields: list[Field],
typename: str,
copy: bool,
) -> Union[Bitcvt, BitcvtSize]:
"""Generate CDR to ROS1 conversion function.
Args:
fields: Fields of message.
typename: Message type name.
copy: Generate conversion or sizing function.
Returns:
CDR to ROS1 conversion function.
"""
# pylint: disable=too-many-branches,too-many-locals,too-many-nested-blocks,too-many-statements
aligned = 8
iterators = tee([*fields, None])
icurr = cast('Iterator[Field]', iterators[0])
inext = iterators[1]
next(inext)
funcname = 'cdr_to_ros1' if copy else 'getsize_cdr_to_ros1'
lines = [
'import sys',
'import numpy',
'from rosbags.serde.messages import SerdeError, get_msgdef',
'from rosbags.serde.primitives import pack_int32_le',
'from rosbags.serde.primitives import unpack_int32_le',
f'def {funcname}(input, ipos, output, opos, typestore):',
]
if typename == 'std_msgs/msg/Header':
lines.append(' opos += 4')
for fcurr, fnext in zip(icurr, inext):
_, desc = fcurr
if desc.valtype == Valtype.MESSAGE:
lines.append(f' func = get_msgdef("{desc.args.name}", typestore).{funcname}')
lines.append(' ipos, opos = func(input, ipos, output, opos, typestore)')
aligned = align_after(desc)
elif desc.valtype == Valtype.BASE:
if desc.args == 'string':
lines.append(' length = unpack_int32_le(input, ipos)[0] - 1')
if copy:
lines.append(' pack_int32_le(output, opos, length)')
lines.append(' ipos += 4')
lines.append(' opos += 4')
if copy:
lines.append(' output[opos:opos + length] = input[ipos:ipos + length]')
lines.append(' ipos += length + 1')
lines.append(' opos += length')
aligned = 1
else:
size = SIZEMAP[desc.args]
if copy:
lines.append(f' output[opos:opos + {size}] = input[ipos:ipos + {size}]')
lines.append(f' ipos += {size}')
lines.append(f' opos += {size}')
aligned = size
elif desc.valtype == Valtype.ARRAY:
subdesc, length = desc.args
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
for _ in range(length):
lines.append(' ipos = (ipos + 4 - 1) & -4')
lines.append(' length = unpack_int32_le(input, ipos)[0] - 1')
if copy:
lines.append(' pack_int32_le(output, opos, length)')
lines.append(' ipos += 4')
lines.append(' opos += 4')
if copy:
lines.append(
' output[opos:opos + length] = input[ipos:ipos + length]',
)
lines.append(' ipos += length + 1')
lines.append(' opos += length')
aligned = 1
else:
size = length * SIZEMAP[subdesc.args]
if copy:
lines.append(f' output[opos:opos + {size}] = input[ipos:ipos + {size}]')
lines.append(f' ipos += {size}')
lines.append(f' opos += {size}')
aligned = SIZEMAP[subdesc.args]
if subdesc.valtype == Valtype.MESSAGE:
anext_before = align(subdesc)
anext_after = align_after(subdesc)
lines.append(f' func = get_msgdef("{subdesc.args.name}", typestore).{funcname}')
for _ in range(length):
if anext_before > anext_after:
lines.append(f' ipos = (ipos + {anext_before} - 1) & -{anext_before}')
lines.append(' ipos, opos = func(input, ipos, output, opos, typestore)')
aligned = anext_after
else:
assert desc.valtype == Valtype.SEQUENCE
lines.append(' size = unpack_int32_le(input, ipos)[0]')
if copy:
lines.append(' pack_int32_le(output, opos, size)')
lines.append(' ipos += 4')
lines.append(' opos += 4')
subdesc = desc.args[0]
aligned = 4
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
lines.append(' for _ in range(size):')
lines.append(' ipos = (ipos + 4 - 1) & -4')
lines.append(' length = unpack_int32_le(input, ipos)[0] - 1')
if copy:
lines.append(' pack_int32_le(output, opos, length)')
lines.append(' ipos += 4')
lines.append(' opos += 4')
if copy:
lines.append(' output[opos:opos + length] = input[ipos:ipos + length]')
lines.append(' ipos += length + 1')
lines.append(' opos += length')
aligned = 1
else:
if aligned < (anext_before := align(subdesc)):
lines.append(' if size:')
lines.append(f' ipos = (ipos + {anext_before} - 1) & -{anext_before}')
lines.append(f' length = size * {SIZEMAP[subdesc.args]}')
if copy:
lines.append(' output[opos:opos + length] = input[ipos:ipos + length]')
lines.append(' ipos += length')
lines.append(' opos += length')
aligned = anext_before
else:
assert subdesc.valtype == Valtype.MESSAGE
anext_before = align(subdesc)
lines.append(f' func = get_msgdef("{subdesc.args.name}", typestore).{funcname}')
lines.append(' for _ in range(size):')
lines.append(f' ipos = (ipos + {anext_before} - 1) & -{anext_before}')
lines.append(' ipos, opos = func(input, ipos, output, opos, typestore)')
aligned = align_after(subdesc)
aligned = min([aligned, 4])
if fnext and aligned < (anext_before := align(fnext.descriptor)):
lines.append(f' ipos = (ipos + {anext_before} - 1) & -{anext_before}')
aligned = anext_before
lines.append(' return ipos, opos')
return getattr(compile_lines(lines), funcname) # type: ignore
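The `+ 1` / `- 1` length adjustments in the two converters come from the string encodings: ROS1 prefixes the raw byte length with no terminator, while CDR prefixes the length including a trailing NUL byte. A sketch with hypothetical helpers (`ros1_string` and `cdr_string` are illustrations, not part of rosbags):

```python
# Hypothetical helpers illustrating the string length adjustment the
# converters perform between the two wire formats.
from struct import pack, unpack_from

def ros1_string(text: str) -> bytes:
    data = text.encode()
    return pack('<I', len(data)) + data        # length excludes terminator

def cdr_string(text: str) -> bytes:
    data = text.encode() + b'\x00'             # NUL-terminated payload
    return pack('<I', len(data)) + data        # length includes the NUL

r1 = ros1_string('hi')
cdr = cdr_string('hi')
assert unpack_from('<I', r1)[0] == 2
assert unpack_from('<I', cdr)[0] == 3          # ROS1 length + 1
assert cdr[4:6] == r1[4:6]                     # payload bytes identical
```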
def generate_getsize_ros1(fields: list[Field], typename: str) -> tuple[CDRSerSize, int]:
"""Generate ros1 size calculation function.
Args:
fields: Fields of message.
typename: Message type name.
Returns:
Size calculation function and static size.
"""
# pylint: disable=too-many-branches,too-many-statements
size = 0
is_stat = True
lines = [
'import sys',
'from rosbags.serde.messages import get_msgdef',
'def getsize_ros1(pos, message, typestore):',
]
if typename == 'std_msgs/msg/Header':
lines.append(' pos += 4')
for fcurr in fields:
fieldname, desc = fcurr
if desc.valtype == Valtype.MESSAGE:
if desc.args.size_ros1:
lines.append(f' pos += {desc.args.size_ros1}')
size += desc.args.size_ros1
else:
lines.append(f' func = get_msgdef("{desc.args.name}", typestore).getsize_ros1')
lines.append(f' pos = func(pos, message.{fieldname}, typestore)')
is_stat = False
elif desc.valtype == Valtype.BASE:
if desc.args == 'string':
lines.append(f' pos += 4 + len(message.{fieldname}.encode())')
is_stat = False
else:
lines.append(f' pos += {SIZEMAP[desc.args]}')
size += SIZEMAP[desc.args]
elif desc.valtype == Valtype.ARRAY:
subdesc, length = desc.args
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
lines.append(f' val = message.{fieldname}')
for idx in range(length):
lines.append(f' pos += 4 + len(val[{idx}].encode())')
is_stat = False
else:
lines.append(f' pos += {length * SIZEMAP[subdesc.args]}')
size += length * SIZEMAP[subdesc.args]
else:
assert subdesc.valtype == Valtype.MESSAGE
if subdesc.args.size_ros1:
for _ in range(length):
lines.append(f' pos += {subdesc.args.size_ros1}')
size += subdesc.args.size_ros1
else:
lines.append(
f' func = get_msgdef("{subdesc.args.name}", typestore).getsize_ros1',
)
lines.append(f' val = message.{fieldname}')
for idx in range(length):
lines.append(f' pos = func(pos, val[{idx}], typestore)')
is_stat = False
else:
assert desc.valtype == Valtype.SEQUENCE
lines.append(' pos += 4')
subdesc = desc.args[0]
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
lines.append(f' for val in message.{fieldname}:')
lines.append(' pos += 4 + len(val.encode())')
else:
lines.append(f' pos += len(message.{fieldname}) * {SIZEMAP[subdesc.args]}')
else:
assert subdesc.valtype == Valtype.MESSAGE
lines.append(f' val = message.{fieldname}')
if subdesc.args.size_ros1:
lines.append(f' pos += {subdesc.args.size_ros1} * len(val)')
else:
lines.append(
f' func = get_msgdef("{subdesc.args.name}", typestore).getsize_ros1',
)
lines.append(' for item in val:')
lines.append(' pos = func(pos, item, typestore)')
is_stat = False
lines.append(' return pos')
return compile_lines(lines).getsize_ros1, is_stat * size
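The `is_stat * size` return value above relies on bool-int multiplication: any dynamic-size field flips the flag, so the message reports a static size of 0. A tiny sketch:

```python
def static_size(is_static: bool, size: int) -> int:
    # A False flag zeroes out the computed size, signalling
    # "no static size" to callers.
    return is_static * size

assert static_size(True, 16) == 16
assert static_size(False, 16) == 0
```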
def generate_serialize_ros1(fields: list[Field], typename: str) -> CDRSer:
"""Generate ros1 serialization function.
Args:
fields: Fields of message.
typename: Message type name.
Returns:
Serializer function.
"""
# pylint: disable=too-many-branches,too-many-statements
lines = [
'import sys',
'import numpy',
'from rosbags.serde.messages import SerdeError, get_msgdef',
'from rosbags.serde.primitives import pack_bool_le',
'from rosbags.serde.primitives import pack_int8_le',
'from rosbags.serde.primitives import pack_int16_le',
'from rosbags.serde.primitives import pack_int32_le',
'from rosbags.serde.primitives import pack_int64_le',
'from rosbags.serde.primitives import pack_uint8_le',
'from rosbags.serde.primitives import pack_uint16_le',
'from rosbags.serde.primitives import pack_uint32_le',
'from rosbags.serde.primitives import pack_uint64_le',
'from rosbags.serde.primitives import pack_float32_le',
'from rosbags.serde.primitives import pack_float64_le',
'def serialize_ros1(rawdata, pos, message, typestore):',
]
if typename == 'std_msgs/msg/Header':
lines.append(' pos += 4')
be_syms = ('>',) if sys.byteorder == 'little' else ('=', '>')
for fcurr in fields:
fieldname, desc = fcurr
lines.append(f' val = message.{fieldname}')
if desc.valtype == Valtype.MESSAGE:
name = desc.args.name
lines.append(f' func = get_msgdef("{name}", typestore).serialize_ros1')
lines.append(' pos = func(rawdata, pos, val, typestore)')
elif desc.valtype == Valtype.BASE:
if desc.args == 'string':
lines.append(' bval = memoryview(val.encode())')
lines.append(' length = len(bval)')
lines.append(' pack_int32_le(rawdata, pos, length)')
lines.append(' pos += 4')
lines.append(' rawdata[pos:pos + length] = bval')
lines.append(' pos += length')
else:
lines.append(f' pack_{desc.args}_le(rawdata, pos, val)')
lines.append(f' pos += {SIZEMAP[desc.args]}')
elif desc.valtype == Valtype.ARRAY:
subdesc, length = desc.args
lines.append(f' if len(val) != {length}:')
lines.append(' raise SerdeError(\'Unexpected array length\')')
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
for idx in range(length):
lines.append(f' bval = memoryview(val[{idx}].encode())')
lines.append(' length = len(bval)')
lines.append(' pack_int32_le(rawdata, pos, length)')
lines.append(' pos += 4')
lines.append(' rawdata[pos:pos + length] = bval')
lines.append(' pos += length')
else:
lines.append(f' if val.dtype.byteorder in {be_syms}:')
lines.append(' val = val.byteswap()')
size = length * SIZEMAP[subdesc.args]
lines.append(f' rawdata[pos:pos + {size}] = val.view(numpy.uint8)')
lines.append(f' pos += {size}')
else:
assert subdesc.valtype == Valtype.MESSAGE
name = subdesc.args.name
lines.append(f' func = get_msgdef("{name}", typestore).serialize_ros1')
for idx in range(length):
lines.append(f' pos = func(rawdata, pos, val[{idx}], typestore)')
else:
assert desc.valtype == Valtype.SEQUENCE
lines.append(' pack_int32_le(rawdata, pos, len(val))')
lines.append(' pos += 4')
subdesc = desc.args[0]
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
lines.append(' for item in val:')
lines.append(' bval = memoryview(item.encode())')
lines.append(' length = len(bval)')
lines.append(' pack_int32_le(rawdata, pos, length)')
lines.append(' pos += 4')
lines.append(' rawdata[pos:pos + length] = bval')
lines.append(' pos += length')
else:
lines.append(f' size = len(val) * {SIZEMAP[subdesc.args]}')
lines.append(f' if val.dtype.byteorder in {be_syms}:')
lines.append(' val = val.byteswap()')
lines.append(' rawdata[pos:pos + size] = val.view(numpy.uint8)')
lines.append(' pos += size')
if subdesc.valtype == Valtype.MESSAGE:
name = subdesc.args.name
lines.append(f' func = get_msgdef("{name}", typestore).serialize_ros1')
lines.append(' for item in val:')
lines.append(' pos = func(rawdata, pos, item, typestore)')
lines.append(' return pos')
return compile_lines(lines).serialize_ros1 # type: ignore
def generate_deserialize_ros1(fields: list[Field], typename: str) -> CDRDeser:
"""Generate ros1 deserialization function.
Args:
fields: Fields of message.
typename: Message type name.
Returns:
Deserializer function.
"""
# pylint: disable=too-many-branches,too-many-statements
lines = [
'import sys',
'import numpy',
'from rosbags.serde.messages import SerdeError, get_msgdef',
'from rosbags.serde.primitives import unpack_bool_le',
'from rosbags.serde.primitives import unpack_int8_le',
'from rosbags.serde.primitives import unpack_int16_le',
'from rosbags.serde.primitives import unpack_int32_le',
'from rosbags.serde.primitives import unpack_int64_le',
'from rosbags.serde.primitives import unpack_uint8_le',
'from rosbags.serde.primitives import unpack_uint16_le',
'from rosbags.serde.primitives import unpack_uint32_le',
'from rosbags.serde.primitives import unpack_uint64_le',
'from rosbags.serde.primitives import unpack_float32_le',
'from rosbags.serde.primitives import unpack_float64_le',
'def deserialize_ros1(rawdata, pos, cls, typestore):',
]
if typename == 'std_msgs/msg/Header':
lines.append(' pos += 4')
be_syms = ('>',) if sys.byteorder == 'little' else ('=', '>')
funcname = 'deserialize_ros1'
lines.append(' values = []')
for fcurr in fields:
desc = fcurr[1]
if desc.valtype == Valtype.MESSAGE:
lines.append(f' msgdef = get_msgdef("{desc.args.name}", typestore)')
lines.append(f' obj, pos = msgdef.{funcname}(rawdata, pos, msgdef.cls, typestore)')
lines.append(' values.append(obj)')
elif desc.valtype == Valtype.BASE:
if desc.args == 'string':
lines.append(' length = unpack_int32_le(rawdata, pos)[0]')
lines.append(' string = bytes(rawdata[pos + 4:pos + 4 + length]).decode()')
lines.append(' values.append(string)')
lines.append(' pos += 4 + length')
else:
lines.append(f' value = unpack_{desc.args}_le(rawdata, pos)[0]')
lines.append(' values.append(value)')
lines.append(f' pos += {SIZEMAP[desc.args]}')
elif desc.valtype == Valtype.ARRAY:
subdesc, length = desc.args
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
lines.append(' value = []')
for _ in range(length):
lines.append(' length = unpack_int32_le(rawdata, pos)[0]')
lines.append(
' value.append(bytes(rawdata[pos + 4:pos + 4 + length]).decode())',
)
lines.append(' pos += 4 + length')
lines.append(' values.append(value)')
else:
size = length * SIZEMAP[subdesc.args]
lines.append(
f' val = numpy.frombuffer(rawdata, '
f'dtype=numpy.{subdesc.args}, count={length}, offset=pos)',
)
lines.append(f' if val.dtype.byteorder in {be_syms}:')
lines.append(' val = val.byteswap()')
lines.append(' values.append(val)')
lines.append(f' pos += {size}')
else:
assert subdesc.valtype == Valtype.MESSAGE
lines.append(f' msgdef = get_msgdef("{subdesc.args.name}", typestore)')
lines.append(' value = []')
for _ in range(length):
lines.append(
f' obj, pos = msgdef.{funcname}(rawdata, pos, msgdef.cls, typestore)',
)
lines.append(' value.append(obj)')
lines.append(' values.append(value)')
else:
assert desc.valtype == Valtype.SEQUENCE
lines.append(' size = unpack_int32_le(rawdata, pos)[0]')
lines.append(' pos += 4')
subdesc = desc.args[0]
if subdesc.valtype == Valtype.BASE:
if subdesc.args == 'string':
lines.append(' value = []')
lines.append(' for _ in range(size):')
lines.append(' length = unpack_int32_le(rawdata, pos)[0]')
lines.append(
' value.append(bytes(rawdata[pos + 4:pos + 4 + length])'
'.decode())',
)
lines.append(' pos += 4 + length')
lines.append(' values.append(value)')
else:
lines.append(f' length = size * {SIZEMAP[subdesc.args]}')
lines.append(
f' val = numpy.frombuffer(rawdata, '
f'dtype=numpy.{subdesc.args}, count=size, offset=pos)',
)
lines.append(f' if val.dtype.byteorder in {be_syms}:')
lines.append(' val = val.byteswap()')
lines.append(' values.append(val)')
lines.append(' pos += length')
if subdesc.valtype == Valtype.MESSAGE:
lines.append(f' msgdef = get_msgdef("{subdesc.args.name}", typestore)')
lines.append(' value = []')
lines.append(' for _ in range(size):')
lines.append(
f' obj, pos = msgdef.{funcname}(rawdata, pos, msgdef.cls, typestore)',
)
lines.append(' value.append(obj)')
lines.append(' values.append(value)')
lines.append(' return cls(*values), pos')
return compile_lines(lines).deserialize_ros1 # type: ignore

View File

@@ -0,0 +1,208 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Serialization, deserializion and conversion functions."""
from __future__ import annotations
import sys
from struct import pack_into
from typing import TYPE_CHECKING
from rosbags.typesys import types
from .messages import get_msgdef
if TYPE_CHECKING:
from typing import Any
from rosbags.typesys.register import Typestore
def deserialize_cdr(
rawdata: bytes,
typename: str,
typestore: Typestore = types,
) -> Any: # noqa: ANN401
"""Deserialize raw data into a message object.
Args:
rawdata: Serialized data.
typename: Message type name.
typestore: Type store.
Returns:
Deserialized message object.
"""
little_endian = bool(rawdata[1])
msgdef = get_msgdef(typename, typestore)
func = msgdef.deserialize_cdr_le if little_endian else msgdef.deserialize_cdr_be
message, pos = func(rawdata[4:], 0, msgdef.cls, typestore)
assert pos + 4 + 3 >= len(rawdata)
return message
def serialize_cdr(
message: object,
typename: str,
little_endian: bool = sys.byteorder == 'little',
typestore: Typestore = types,
) -> memoryview:
"""Serialize message object to bytes.
Args:
message: Message object.
typename: Message type name.
little_endian: Whether to serialize with little endianness.
typestore: Type store.
Returns:
Serialized bytes.
"""
msgdef = get_msgdef(typename, typestore)
size = 4 + msgdef.getsize_cdr(0, message, typestore)
rawdata = memoryview(bytearray(size))
pack_into('BB', rawdata, 0, 0, little_endian)
func = msgdef.serialize_cdr_le if little_endian else msgdef.serialize_cdr_be
pos = func(rawdata[4:], 0, message, typestore)
assert pos + 4 == size
return rawdata.toreadonly()
def deserialize_ros1(
rawdata: bytes,
typename: str,
typestore: Typestore = types,
) -> Any: # noqa: ANN401
"""Deserialize raw data into a message object.
Args:
rawdata: Serialized data.
typename: Message type name.
typestore: Type store.
Returns:
Deserialized message object.
"""
msgdef = get_msgdef(typename, typestore)
func = msgdef.deserialize_ros1
message, pos = func(rawdata, 0, msgdef.cls, typestore)
assert pos == len(rawdata)
return message
def serialize_ros1(
message: object,
typename: str,
typestore: Typestore = types,
) -> memoryview:
"""Serialize message object to bytes.
Args:
message: Message object.
typename: Message type name.
typestore: Type store.
Returns:
Serialized bytes.
"""
msgdef = get_msgdef(typename, typestore)
size = msgdef.getsize_ros1(0, message, typestore)
rawdata = memoryview(bytearray(size))
func = msgdef.serialize_ros1
pos = func(rawdata, 0, message, typestore)
assert pos == size
return rawdata.toreadonly()
def ros1_to_cdr(raw: bytes, typename: str, typestore: Typestore = types) -> memoryview:
"""Convert serialized ROS1 message directly to CDR.
This is reasonably fast, as the conversion happens at the byte level
without going through full deserialization and serialization.
Args:
raw: ROS1 serialized message.
typename: Message type name.
typestore: Type store.
Returns:
CDR serialized message.
"""
msgdef = get_msgdef(typename, typestore)
ipos, opos = msgdef.getsize_ros1_to_cdr(
raw,
0,
None,
0,
typestore,
)
assert ipos == len(raw)
raw = memoryview(raw)
size = 4 + opos
rawdata = memoryview(bytearray(size))
pack_into('BB', rawdata, 0, 0, True)
ipos, opos = msgdef.ros1_to_cdr(
raw,
0,
rawdata[4:],
0,
typestore,
)
assert ipos == len(raw)
assert opos + 4 == size
return rawdata.toreadonly()
def cdr_to_ros1(raw: bytes, typename: str, typestore: Typestore = types) -> memoryview:
"""Convert serialized CDR message directly to ROS1.
This is reasonably fast, as the conversion happens at the byte level
without going through full deserialization and serialization.
Args:
raw: CDR serialized message.
typename: Message type name.
typestore: Type store.
Returns:
ROS1 serialized message.
"""
assert raw[1] == 1, 'Message byte order is not little endian'
msgdef = get_msgdef(typename, typestore)
ipos, opos = msgdef.getsize_cdr_to_ros1(
raw[4:],
0,
None,
0,
typestore,
)
assert ipos + 4 + 3 >= len(raw)
raw = memoryview(raw)
size = opos
rawdata = memoryview(bytearray(size))
ipos, opos = msgdef.cdr_to_ros1(
raw[4:],
0,
rawdata,
0,
typestore,
)
assert ipos + 4 + 3 >= len(raw)
assert opos == size
return rawdata.toreadonly()
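The functions above prepend a 4-byte CDR encapsulation header, of which only byte 1 is used here: it encodes endianness (1 = little endian), and `cdr_to_ros1` checks exactly that byte. A sketch of how that header is built, assuming the remaining bytes are zero padding as in `serialize_cdr`:

```python
# Build the 4-byte CDR encapsulation header the same way serialize_cdr
# does: pack_into('BB', buf, 0, 0, little_endian) on a zeroed buffer.
from struct import pack_into

def cdr_header(little_endian: bool = True) -> bytes:
    buf = bytearray(4)
    pack_into('BB', buf, 0, 0, little_endian)
    return bytes(buf)

assert cdr_header(True) == b'\x00\x01\x00\x00'
assert cdr_header(True)[1] == 1   # the `raw[1] == 1` check in cdr_to_ros1
```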

View File

@@ -0,0 +1,55 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Python types used in this package."""
from __future__ import annotations
from typing import TYPE_CHECKING, NamedTuple
if TYPE_CHECKING:
from typing import Any, Callable, Tuple
from rosbags.typesys.register import Typestore
Bitcvt = Callable[[bytes, int, bytes, int, Typestore], Tuple[int, int]]
BitcvtSize = Callable[[bytes, int, None, int, Typestore], Tuple[int, int]]
CDRDeser = Callable[[bytes, int, type, Typestore], Tuple[Any, int]]
CDRSer = Callable[[bytes, int, object, Typestore], int]
CDRSerSize = Callable[[int, object, Typestore], int]
class Descriptor(NamedTuple):
"""Value type descriptor."""
valtype: int
args: Any
class Field(NamedTuple):
"""Metadata of a field."""
name: str
descriptor: Descriptor
class Msgdef(NamedTuple):
"""Metadata of a message."""
name: str
fields: list[Field]
cls: Any
size_cdr: int
getsize_cdr: CDRSerSize
serialize_cdr_le: CDRSer
serialize_cdr_be: CDRSer
deserialize_cdr_le: CDRDeser
deserialize_cdr_be: CDRDeser
size_ros1: int
getsize_ros1: CDRSerSize
serialize_ros1: CDRSer
deserialize_ros1: CDRDeser
getsize_ros1_to_cdr: BitcvtSize
ros1_to_cdr: Bitcvt
getsize_cdr_to_ros1: BitcvtSize
cdr_to_ros1: Bitcvt

View File

@@ -0,0 +1,99 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Helpers used by code generators."""
from __future__ import annotations
from enum import IntEnum
from importlib.util import module_from_spec, spec_from_loader
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from types import ModuleType
from .typing import Descriptor
class Valtype(IntEnum):
"""Msg field value types."""
BASE = 1
MESSAGE = 2
ARRAY = 3
SEQUENCE = 4
SIZEMAP: dict[str, int] = {
'bool': 1,
'int8': 1,
'int16': 2,
'int32': 4,
'int64': 8,
'uint8': 1,
'uint16': 2,
'uint32': 4,
'uint64': 8,
'float32': 4,
'float64': 8,
}
def align(entry: Descriptor) -> int:
"""Get alignment requirement for entry.
Args:
entry: Field.
Returns:
Required alignment in bytes.
"""
if entry.valtype == Valtype.BASE:
if entry.args == 'string':
return 4
return SIZEMAP[entry.args]
if entry.valtype == Valtype.MESSAGE:
return align(entry.args.fields[0].descriptor)
if entry.valtype == Valtype.ARRAY:
return align(entry.args[0])
assert entry.valtype == Valtype.SEQUENCE
return 4
def align_after(entry: Descriptor) -> int:
"""Get alignment after entry.
Args:
entry: Field.
Returns:
Memory alignment after entry.
"""
if entry.valtype == Valtype.BASE:
if entry.args == 'string':
return 1
return SIZEMAP[entry.args]
if entry.valtype == Valtype.MESSAGE:
return align_after(entry.args.fields[-1].descriptor)
if entry.valtype == Valtype.ARRAY:
return align_after(entry.args[0])
assert entry.valtype == Valtype.SEQUENCE
return min([4, align_after(entry.args[0])])
def compile_lines(lines: list[str]) -> ModuleType:
"""Compile lines of code to module.
Args:
lines: Lines of python code.
Returns:
Compiled and loaded module.
"""
spec = spec_from_loader('tmpmod', loader=None)
assert spec
module = module_from_spec(spec)
exec('\n'.join(lines), module.__dict__) # pylint: disable=exec-used
return module
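`compile_lines` is the pattern all the generators above funnel into: build a throwaway module from generated source lines and pull the compiled function out of it. A self-contained sketch of the same pattern:

```python
# Compile a list of source lines into an anonymous in-memory module.
from importlib.util import module_from_spec, spec_from_loader

def compile_lines(lines):
    spec = spec_from_loader('tmpmod', loader=None)
    assert spec
    module = module_from_spec(spec)
    exec('\n'.join(lines), module.__dict__)
    return module

mod = compile_lines([
    'def double(x):',
    '    return 2 * x',
])
assert mod.double(21) == 42
```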

View File

@@ -0,0 +1,30 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Rosbags Type System.
The type system manages ROS message types and ships all standard ROS2
distribution message types by default. The system supports custom message
types through parsers that dynamically parse custom message definitions
from different source formats.
Supported formats:
- IDL files (subset of the standard necessary for parsing ROS2 IDL) `[1]`_
- MSG files `[2]`_
.. _[1]: https://www.omg.org/spec/IDL/About-IDL/
.. _[2]: http://wiki.ros.org/msg
"""
from .base import TypesysError
from .idl import get_types_from_idl
from .msg import generate_msgdef, get_types_from_msg
from .register import register_types
__all__ = [
'TypesysError',
'generate_msgdef',
'get_types_from_idl',
'get_types_from_msg',
'register_types',
]

View File

@@ -0,0 +1,62 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Tool to update builtin types shipped with rosbags."""
from __future__ import annotations
from itertools import groupby
from os import walk
from pathlib import Path
from typing import TYPE_CHECKING
from .idl import get_types_from_idl
from .msg import get_types_from_msg
from .register import generate_python_code, register_types
if TYPE_CHECKING:
from .base import Typesdict
def generate_docs(typs: Typesdict) -> str:
"""Generate types documentation."""
res = []
for namespace, msgs in groupby([x.split('/msg/') for x in typs], key=lambda x: x[0]):
res.append(namespace)
res.append('*' * len(namespace))
for _, msg in msgs:
res.append(f'- :py:class:`{msg} <rosbags.typesys.types.{namespace}__msg__{msg}>`')
res.append('')
return '\n'.join(res)
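`generate_docs` groups full type names like `pkg/msg/Type` by package namespace with `itertools.groupby`, which only works because `typs` is sorted before this runs (`groupby` collapses consecutive keys only). A standalone sketch of that grouping:

```python
# Group sorted 'pkg/msg/Type' names by package namespace, as
# generate_docs does; groupby requires input sorted by key.
from itertools import groupby

names = ['geometry_msgs/msg/Point', 'geometry_msgs/msg/Pose', 'std_msgs/msg/Header']
grouped = {
    namespace: [msg for _, msg in items]
    for namespace, items in groupby(
        (x.split('/msg/') for x in names), key=lambda x: x[0])
}

assert grouped == {
    'geometry_msgs': ['Point', 'Pose'],
    'std_msgs': ['Header'],
}
```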
def main() -> None: # pragma: no cover
"""Update builtin types.
Discover message definitions in filesystem and generate types.py module.
"""
typs: Typesdict = {}
selfdir = Path(__file__).parent
projectdir = selfdir.parent.parent.parent
for root, dirnames, files in walk(selfdir.parents[2] / 'tools' / 'messages'):
if '.rosbags_ignore' in files:
dirnames.clear()
continue
for fname in files:
path = Path(root, fname)
if path.suffix == '.idl':
typs.update(get_types_from_idl(path.read_text(encoding='utf-8')))
elif path.suffix == '.msg':
name = path.relative_to(path.parents[2]).with_suffix('')
if '/msg/' not in str(name):
name = name.parent / 'msg' / name.name
typs.update(get_types_from_msg(path.read_text(encoding='utf-8'), str(name)))
typs = dict(sorted(typs.items()))
register_types(typs)
(selfdir / 'types.py').write_text(generate_python_code(typs))
(projectdir / 'docs' / 'topics' / 'typesys-types.rst').write_text(generate_docs(typs))
if __name__ == '__main__':
main()

View File

@@ -0,0 +1,72 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Types and helpers used by message definition converters."""
from __future__ import annotations
from enum import IntEnum, auto
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from typing import Any, Dict, List, Optional, Tuple, Union
from .peg import Visitor
Constdefs = List[Tuple[str, str, Any]]
Fielddesc = Tuple[int, Union[str, Tuple[Tuple[int, str], Optional[int]]]]
Fielddefs = List[Tuple[str, Fielddesc]]
Typesdict = Dict[str, Tuple[Constdefs, Fielddefs]]
class TypesysError(Exception):
"""Parser error."""
class Nodetype(IntEnum):
"""Parse tree node types.
The first four match the Valtypes of final message definitions.
"""
BASE = auto()
NAME = auto()
ARRAY = auto()
SEQUENCE = auto()
LITERAL_STRING = auto()
LITERAL_NUMBER = auto()
LITERAL_BOOLEAN = auto()
LITERAL_CHAR = auto()
MODULE = auto()
CONST = auto()
STRUCT = auto()
SDECLARATOR = auto()
ADECLARATOR = auto()
ANNOTATION = auto()
EXPRESSION_BINARY = auto()
EXPRESSION_UNARY = auto()
def parse_message_definition(visitor: Visitor, text: str) -> Typesdict:
"""Parse message definition.
Args:
visitor: Visitor instance to use.
text: Message definition.
Returns:
Types dictionary of the parsed message definition.
Raises:
TypesysError: Message parsing failed.
"""
try:
rule = visitor.RULES['specification']
pos = rule.skip_ws(text, 0)
npos, trees = rule.parse(text, pos)
assert npos == len(text), f'Could not parse: {text!r}'
return visitor.visit(trees) # type: ignore
except Exception as err:
raise TypesysError(f'Could not parse: {text!r}') from err
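The aliases above fully describe the shape a parser must produce. A hypothetical Typesdict entry written out by hand (type name, constant, and fields are invented for illustration; the integer 1 stands for Nodetype.BASE, the first auto() member of the IntEnum):

```python
# Hypothetical Typesdict entry; 1 == Nodetype.BASE (first auto() value).
typs = {
    'example_msgs/msg/Point2D': (
        [('GRID', 'uint8', 1)],            # Constdefs: (name, type, value)
        [('x', (1, 'float64')),            # Fielddefs: (name, Fielddesc)
         ('y', (1, 'float64'))],
    ),
}

consts, fields = typs['example_msgs/msg/Point2D']
assert [f[0] for f in fields] == ['x', 'y']
```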

View File

@ -0,0 +1,597 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""IDL Parser.
Grammar, parse tree visitor and conversion functions for message definitions in
`IDL`_ format.
.. _IDL: https://www.omg.org/spec/IDL/About-IDL/
"""
from __future__ import annotations
import re
from typing import TYPE_CHECKING
from .base import Nodetype, parse_message_definition
from .peg import Visitor, parse_grammar
if TYPE_CHECKING:
from typing import Any, Generator, Optional, Tuple, Union
from .base import Fielddefs, Fielddesc, Typesdict
StringNode = Tuple[Nodetype, str]
ConstValue = Union[str, bool, int, float]
LiteralMatch = Tuple[str, str]
LiteralNode = Tuple[Nodetype, ConstValue]
GRAMMAR_IDL = r"""
specification
= definition+
definition
= macro
/ include
/ module_dcl ';'
/ const_dcl ';'
/ type_dcl ';'
macro
= ifndef
/ define
/ endif
ifndef
= '#ifndef' r'[a-zA-Z0-9_]+'
define
= '#define' r'[a-zA-Z0-9_]+'
endif
= '#endif'
include
= '#include' include_filename
include_filename
= '<' r'[^>]+' '>'
/ '"' r'[^"]+' '"'
module_dcl
= annotation* 'module' identifier '{' definition+ '}'
const_dcl
= 'const' const_type identifier '=' expression
type_dcl
= typedef_dcl
/ constr_type_dcl
typedef_dcl
= 'typedef' type_declarator
type_declarator
= ( template_type_spec
/ simple_type_spec
/ constr_type_dcl
) any_declarators
simple_type_spec
= base_type_spec
/ scoped_name
template_type_spec
= sequence_type
/ string_type
sequence_type
= 'sequence' '<' type_spec ',' expression '>'
/ 'sequence' '<' type_spec '>'
type_spec
= template_type_spec
/ simple_type_spec
any_declarators
= any_declarator (',' any_declarator)*
any_declarator
= array_declarator
/ simple_declarator
constr_type_dcl
= struct_dcl
struct_dcl
= struct_def
struct_def
= annotation* 'struct' identifier '{' member+ '}'
member
= annotation* type_spec declarators ';'
declarators
= declarator (',' declarator)*
declarator
= array_declarator
/ simple_declarator
simple_declarator
= identifier
array_declarator
= identifier fixed_array_size+
fixed_array_size
= '[' expression ']'
annotation
= '@' scoped_name ('(' annotation_params ')')?
annotation_params
= annotation_param (',' annotation_param)*
/ expression
annotation_param
= identifier '=' expression
const_type
= base_type_spec
/ string_type
/ scoped_name
base_type_spec
= integer_type
/ float_type
/ char_type
/ boolean_type
/ octet_type
integer_type
= r'u?int(64|32|16|8)\b'
/ r'(unsigned\s+)?((long\s+)?long|int|short)\b'
float_type
= r'((long\s+)?double|float)\b'
char_type
= r'char\b'
boolean_type
= r'boolean\b'
octet_type
= r'octet\b'
string_type
= 'string' '<' expression '>'
/ r'string\b'
scoped_name
= identifier '::' scoped_name
/ '::' scoped_name
/ identifier
identifier
= r'[a-zA-Z_][a-zA-Z_0-9]*'
expression
= primary_expr binary_operator primary_expr
/ primary_expr
/ unary_operator primary_expr
primary_expr
= literal
/ scoped_name
/ '(' expression ')'
binary_operator
= '|'
/ '^'
/ '&'
/ '<<'
/ '>>'
/ '+'
/ '-'
/ '*'
/ '/'
/ '%'
unary_operator
= '+'
/ '-'
/ '~'
literal
= boolean_literal
/ float_literal
/ integer_literal
/ character_literal
/ string_literals
boolean_literal
= 'TRUE'
/ 'FALSE'
integer_literal
= hexadecimal_literal
/ octal_literal
/ decimal_literal
decimal_literal
= r'[-+]?[1-9][0-9]+'
/ r'[-+]?[0-9]'
octal_literal
= r'[-+]?0[0-7]+'
hexadecimal_literal
= r'[-+]?0[xX][a-fA-F0-9]+'
float_literal
= r'[-+]?[0-9]*\.[0-9]+([eE][-+]?[0-9]+)?'
/ r'[-+]?[0-9]*\.?[0-9]+([eE][-+]?[0-9]+)'
character_literal
= '\'' r'[a-zA-Z0-9_]' '\''
string_literals
= string_literal+
string_literal
= '"' r'(\\"|[^"])*' '"'
"""
class VisitorIDL(Visitor): # pylint: disable=too-many-public-methods
"""IDL file visitor."""
RULES = parse_grammar(
GRAMMAR_IDL,
re.compile(r'(\s|/[*]([^*]|[*](?!/))*[*]/|//[^\n]*$)+', re.M | re.S),
)
def __init__(self) -> None:
"""Initialize."""
super().__init__()
self.typedefs: dict[str, Fielddesc] = {}
# yapf: disable
def visit_specification(
self,
children: tuple[
Optional[
tuple[
tuple[
Nodetype,
list[tuple[Nodetype, tuple[str, str, ConstValue]]],
list[tuple[Nodetype, str, Fielddefs]],
],
LiteralMatch,
],
],
],
) -> Typesdict:
"""Process start symbol, return only children of modules."""
structs: dict[str, Fielddefs] = {}
consts: dict[str, list[tuple[str, str, ConstValue]]] = {}
for item in children:
if item is None or item[0][0] != Nodetype.MODULE:
continue
for csubitem in item[0][1]:
assert csubitem[0] == Nodetype.CONST
if '_Constants/' in csubitem[1][1]:
structname, varname = csubitem[1][1].split('_Constants/')
if structname not in consts:
consts[structname] = []
consts[structname].append((varname, csubitem[1][0], csubitem[1][2]))
for ssubitem in item[0][2]:
assert ssubitem[0] == Nodetype.STRUCT
structs[ssubitem[1]] = ssubitem[2]
if ssubitem[1] not in consts:
consts[ssubitem[1]] = []
return {k: (consts[k], v) for k, v in structs.items()}
# yapf: enable
def visit_macro(self, _: Union[LiteralMatch, tuple[LiteralMatch, str]]) -> None:
"""Process macro, suppress output."""
def visit_include(
self,
_: tuple[LiteralMatch, tuple[LiteralMatch, str, LiteralMatch]],
) -> None:
"""Process include, suppress output."""
# yapf: disable
def visit_module_dcl(
self,
children: tuple[tuple[()], LiteralMatch, StringNode, LiteralMatch, Any, LiteralMatch],
) -> tuple[
Nodetype,
list[tuple[Nodetype, tuple[str, str, ConstValue]]],
list[tuple[Nodetype, str, Fielddefs]],
]:
"""Process module declaration."""
assert len(children) == 6
assert children[2][0] == Nodetype.NAME
name = children[2][1]
definitions = children[4]
consts = []
structs = []
for item in definitions:
if item is None or item[0] is None:
continue
assert item[1] == ('LITERAL', ';')
item = item[0]
if item[0] == Nodetype.CONST:
consts.append(item)
elif item[0] == Nodetype.STRUCT:
structs.append(item)
else:
assert item[0] == Nodetype.MODULE
consts += item[1]
structs += item[2]
consts = [(ityp, (typ, f'{name}/{subname}', val)) for ityp, (typ, subname, val) in consts]
structs = [(typ, f'{name}/{subname}', *rest) for typ, subname, *rest in structs]
return (Nodetype.MODULE, consts, structs)
# yapf: enable
def visit_const_dcl(
self,
children: tuple[LiteralMatch, StringNode, StringNode, LiteralMatch, LiteralNode],
) -> tuple[Nodetype, tuple[str, str, ConstValue]]:
"""Process const declaration."""
return (Nodetype.CONST, (children[1][1], children[2][1], children[4][1]))
def visit_type_dcl(
self,
children: Optional[tuple[Nodetype, str, Fielddefs]],
) -> Optional[tuple[Nodetype, str, Fielddefs]]:
"""Process type, pass structs, suppress otherwise."""
return children if children and children[0] == Nodetype.STRUCT else None
def visit_typedef_dcl(
self,
children: tuple[LiteralMatch, tuple[StringNode, tuple[Any, ...]]],
) -> None:
"""Process type declarator, register type mapping in instance typedef dictionary."""
assert len(children) == 2
dclchildren = children[1]
assert len(dclchildren) == 2
base: Fielddesc
value: Fielddesc
base = typedef if (typedef := self.typedefs.get(dclchildren[0][1])) else dclchildren[0]
flat = [dclchildren[1][0], *[x[1:][0] for x in dclchildren[1][1]]]
for declarator in flat:
if declarator[0] == Nodetype.ADECLARATOR:
typ, name = base
assert isinstance(typ, Nodetype)
assert isinstance(name, str)
assert isinstance(declarator[2][1], int)
value = (Nodetype.ARRAY, ((typ, name), declarator[2][1]))
else:
value = base
self.typedefs[declarator[1][1]] = value
def visit_sequence_type(
self,
children: Union[tuple[LiteralMatch, LiteralMatch, StringNode, LiteralMatch],
tuple[LiteralMatch, LiteralMatch, StringNode, LiteralMatch, LiteralNode,
LiteralMatch]],
) -> tuple[Nodetype, tuple[StringNode, None]]:
"""Process sequence type specification."""
assert len(children) in {4, 6}
if len(children) == 6:
idx = len(children) - 2
assert children[idx][0] == Nodetype.LITERAL_NUMBER
return (Nodetype.SEQUENCE, (children[2], None))
# yapf: disable
def create_struct_field(
self,
parts: tuple[
tuple[()],
Fielddesc,
tuple[
tuple[Nodetype, StringNode],
tuple[
tuple[str, tuple[Nodetype, StringNode]],
...,
],
],
LiteralMatch,
],
) -> Generator[tuple[str, Fielddesc], None, None]:
"""Create struct field and expand typedefs."""
typename, params = parts[1:3]
flat = [params[0], *[x[1:][0] for x in params[1]]]
def resolve_name(name: Fielddesc) -> Fielddesc:
while name[0] == Nodetype.NAME and name[1] in self.typedefs:
assert isinstance(name[1], str)
name = self.typedefs[name[1]]
return name
yield from ((x[1][1], resolve_name(typename)) for x in flat if x)
# yapf: enable
def visit_struct_dcl(
self,
children: tuple[tuple[()], LiteralMatch, StringNode, LiteralMatch, Any, LiteralMatch],
) -> tuple[Nodetype, str, Any]:
"""Process struct declaration."""
assert len(children) == 6
assert children[2][0] == Nodetype.NAME
fields = [y for x in children[4] for y in self.create_struct_field(x)]
return (Nodetype.STRUCT, children[2][1], fields)
def visit_simple_declarator(self, children: StringNode) -> tuple[Nodetype, StringNode]:
"""Process simple declarator."""
assert len(children) == 2
return (Nodetype.SDECLARATOR, children)
def visit_array_declarator(
self,
children: tuple[StringNode, tuple[tuple[LiteralMatch, LiteralNode, LiteralMatch]]],
) -> tuple[Nodetype, StringNode, LiteralNode]:
"""Process array declarator."""
assert len(children) == 2
return (Nodetype.ADECLARATOR, children[0], children[1][0][1])
# yapf: disable
def visit_annotation(
self,
children: tuple[
LiteralMatch,
StringNode,
tuple[
tuple[
LiteralMatch,
tuple[
tuple[StringNode, LiteralMatch, LiteralNode],
tuple[
tuple[LiteralMatch, tuple[StringNode, LiteralMatch, LiteralNode]],
...,
],
],
LiteralMatch,
],
],
],
) -> tuple[Nodetype, str, list[tuple[StringNode, LiteralNode]]]:
"""Process annotation."""
assert len(children) == 3
assert children[1][0] == Nodetype.NAME
params = children[2][0][1]
flat = [params[0], *[x[1:][0] for x in params[1]]]
assert all(len(x) == 3 for x in flat)
retparams = [(x[0], x[2]) for x in flat]
return (Nodetype.ANNOTATION, children[1][1], retparams)
# yapf: enable
def visit_base_type_spec(self, children: str) -> StringNode:
"""Process base type specifier."""
oname = children
name = {
'boolean': 'bool',
'double': 'float64',
'float': 'float32',
'octet': 'uint8',
}.get(oname, oname)
return (Nodetype.BASE, name)
def visit_string_type(
self,
children: Union[str, tuple[LiteralMatch, LiteralMatch, LiteralNode, LiteralMatch]],
) -> Union[StringNode, tuple[Nodetype, str, LiteralNode]]:
"""Prrocess string type specifier."""
if isinstance(children, str):
return (Nodetype.BASE, 'string')
assert len(children) == 4
assert isinstance(children[0], tuple)
return (Nodetype.BASE, 'string', children[2])
def visit_scoped_name(
self,
children: Union[StringNode, tuple[StringNode, LiteralMatch, StringNode]],
) -> StringNode:
"""Process scoped name."""
if len(children) == 2:
assert isinstance(children[1], str)
return (Nodetype.NAME, children[1])
assert len(children) == 3
assert isinstance(children[0], tuple)
assert children[1][1] == '::'
return (Nodetype.NAME, f'{children[0][1]}/{children[2][1]}')
def visit_identifier(self, children: str) -> StringNode:
"""Process identifier."""
return (Nodetype.NAME, children)
def visit_expression(
self,
children: Union[LiteralNode, tuple[LiteralMatch, LiteralNode],
tuple[LiteralNode, LiteralMatch, LiteralNode]],
) -> Union[LiteralNode, tuple[Nodetype, str, int], tuple[Nodetype, str, int, int]]:
"""Process expression, literals are assumed to be integers only."""
if children[0] in [
Nodetype.LITERAL_STRING,
Nodetype.LITERAL_NUMBER,
Nodetype.LITERAL_BOOLEAN,
Nodetype.LITERAL_CHAR,
Nodetype.NAME,
]:
assert isinstance(children[1], (str, bool, int, float))
return (children[0], children[1])
assert isinstance(children[0], tuple)
if len(children) == 3:
assert isinstance(children[0][1], int)
assert isinstance(children[1][1], str)
assert isinstance(children[2][1], int)
return (Nodetype.EXPRESSION_BINARY, children[1][1], children[0][1], children[2][1])
assert len(children) == 2
assert isinstance(children[0][1], str)
assert isinstance(children[1], tuple)
assert isinstance(children[1][1], int)
return (Nodetype.EXPRESSION_UNARY, children[0][1], children[1][1])
def visit_boolean_literal(self, children: str) -> LiteralNode:
"""Process boolean literal."""
return (Nodetype.LITERAL_BOOLEAN, children[1] == 'TRUE')
def visit_float_literal(self, children: str) -> LiteralNode:
"""Process float literal."""
return (Nodetype.LITERAL_NUMBER, float(children))
def visit_decimal_literal(self, children: str) -> LiteralNode:
"""Process decimal integer literal."""
return (Nodetype.LITERAL_NUMBER, int(children))
def visit_octal_literal(self, children: str) -> LiteralNode:
"""Process octal integer literal."""
return (Nodetype.LITERAL_NUMBER, int(children, 8))
def visit_hexadecimal_literal(self, children: str) -> LiteralNode:
"""Process hexadecimal integer literal."""
return (Nodetype.LITERAL_NUMBER, int(children, 16))
def visit_character_literal(
self,
children: tuple[LiteralMatch, str, LiteralMatch],
) -> StringNode:
"""Process char literal."""
return (Nodetype.LITERAL_CHAR, children[1])
def visit_string_literals(
self,
children: tuple[tuple[LiteralMatch, str, LiteralMatch], ...],
) -> StringNode:
"""Process string literal."""
return (
Nodetype.LITERAL_STRING,
''.join(x[1] for x in children),
)
def get_types_from_idl(text: str) -> Typesdict:
"""Get types from idl message definition.
Args:
text: Message definition.
Returns:
Dictionary of message names and parse trees.
"""
return parse_message_definition(VisitorIDL(), text)
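The whitespace pattern handed to parse_grammar in VisitorIDL.RULES doubles as a comment skipper for IDL block and line comments. A self-contained check of that regex in isolation:

```python
import re

# Same pattern as in VisitorIDL.RULES: whitespace, /* */ blocks, // line comments.
ws = re.compile(r'(\s|/[*]([^*]|[*](?!/))*[*]/|//[^\n]*$)+', re.M | re.S)

text = '/* block */  // line\nmodule'
match = ws.match(text)
assert match is not None
assert text[match.end():] == 'module'  # comments and whitespace are consumed
```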


@ -0,0 +1,449 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""MSG Parser.
Grammar, parse tree visitor and conversion functions for message definitions in
`MSG`_ format. It also supports concatenated message definitions as found in
Rosbag1 connection information.
.. _MSG: http://wiki.ros.org/msg
"""
from __future__ import annotations
import re
from hashlib import md5
from pathlib import PurePosixPath as Path
from typing import TYPE_CHECKING
from .base import Nodetype, TypesysError, parse_message_definition
from .peg import Rule, Visitor, parse_grammar
from .types import FIELDDEFS
if TYPE_CHECKING:
from typing import Optional, Tuple, TypeVar, Union
from .base import Constdefs, Fielddefs, Fielddesc, Typesdict
T = TypeVar('T')
StringNode = Tuple[Nodetype, str]
ConstValue = Union[str, bool, int, float]
Msgdesc = Tuple[Tuple[StringNode, Tuple[str, str, int], str], ...]
LiteralMatch = Tuple[str, str]
GRAMMAR_MSG = r"""
specification
= msgdef (msgsep msgdef)*
msgdef
= r'MSG:\s' scoped_name definition*
msgsep
= r'================================================================================'
definition
= const_dcl
/ field_dcl
const_dcl
= 'string' identifier '=' r'(?!={79}\n)[^\n]+'
/ type_spec identifier '=' float_literal
/ type_spec identifier '=' integer_literal
/ type_spec identifier '=' boolean_literal
field_dcl
= type_spec identifier default_value?
type_spec
= array_type_spec
/ bounded_array_type_spec
/ simple_type_spec
array_type_spec
= simple_type_spec array_size
bounded_array_type_spec
= simple_type_spec array_bounds
simple_type_spec
= 'string' '<=' integer_literal
/ scoped_name
array_size
= '[' integer_literal? ']'
array_bounds
= '[<=' integer_literal ']'
scoped_name
= identifier '/' scoped_name
/ identifier
identifier
= r'[a-zA-Z_][a-zA-Z_0-9]*'
default_value
= literal
literal
= float_literal
/ integer_literal
/ boolean_literal
/ string_literal
/ array_literal
boolean_literal
= r'[tT][rR][uU][eE]'
/ r'[fF][aA][lL][sS][eE]'
/ '0'
/ '1'
integer_literal
= hexadecimal_literal
/ octal_literal
/ decimal_literal
decimal_literal
= r'[-+]?[1-9][0-9]+'
/ r'[-+]?[0-9]'
octal_literal
= r'[-+]?0[0-7]+'
hexadecimal_literal
= r'[-+]?0[xX][a-fA-F0-9]+'
float_literal
= r'[-+]?[0-9]*\.[0-9]+([eE][-+]?[0-9]+)?'
/ r'[-+]?[0-9]*\.?[0-9]+([eE][-+]?[0-9]+)'
string_literal
= '"' r'(\\"|[^"])*' '"'
/ '\'' r'(\\\'|[^'])*' '\''
array_literal
= '[' array_elements? ']'
array_elements
= literal ',' array_elements
/ literal
"""
def normalize_msgtype(name: str) -> str:
"""Normalize message typename.
Args:
name: Message typename.
Returns:
Normalized name.
"""
path = Path(name)
if path.parent.name != 'msg':
path = path.parent / 'msg' / path.name
return str(path)
def normalize_fieldtype(typename: str, field: Fielddesc, names: list[str]) -> Fielddesc:
"""Normalize field typename.
Args:
typename: Type name of field owner.
field: Field definition.
names: Valid message names.
Returns:
Normalized fieldtype.
"""
dct = {Path(name).name: name for name in names}
ftype, args = field
name = args if ftype == Nodetype.NAME else args[0][1]
assert isinstance(name, str)
if name in VisitorMSG.BASETYPES:
ifield = (Nodetype.BASE, name)
else:
if name in dct:
name = dct[name]
elif name == 'Header':
name = 'std_msgs/msg/Header'
elif '/' not in name:
name = str(Path(typename).parent / name)
elif '/msg/' not in name:
name = str((path := Path(name)).parent / 'msg' / path.name)
ifield = (Nodetype.NAME, name)
if ftype == Nodetype.NAME:
return ifield
assert not isinstance(args, str)
return (ftype, (ifield, args[1]))
def denormalize_msgtype(typename: str) -> str:
"""Undo message tyoename normalization.
Args:
typename: Normalized message typename.
Returns:
ROS1 style name.
"""
assert '/msg/' in typename
return str((path := Path(typename)).parent.parent / path.name)
class VisitorMSG(Visitor):
"""MSG file visitor."""
RULES = parse_grammar(GRAMMAR_MSG, re.compile(r'(\s|#[^\n]*$)+', re.M | re.S))
BASETYPES = {
'bool',
'int8',
'int16',
'int32',
'int64',
'uint8',
'uint16',
'uint32',
'uint64',
'float32',
'float64',
'string',
}
def visit_const_dcl(
self,
children: tuple[StringNode, StringNode, LiteralMatch, ConstValue],
) -> tuple[StringNode, tuple[str, str, ConstValue]]:
"""Process const declaration, suppress output."""
value: Union[str, bool, int, float]
if (typ := children[0][1]) == 'string':
assert isinstance(children[3], str)
value = children[3].strip()
else:
value = children[3]
return (Nodetype.CONST, ''), (typ, children[1][1], value)
def visit_specification(
self,
children: tuple[tuple[str, Msgdesc], tuple[tuple[str, tuple[str, Msgdesc]], ...]],
) -> Typesdict:
"""Process start symbol."""
typelist = [children[0], *[x[1] for x in children[1]]]
typedict = dict(typelist)
names = list(typedict.keys())
res: Typesdict = {}
for name, items in typedict.items():
consts: Constdefs = [
(x[1][1], x[1][0], x[1][2]) for x in items if x[0] == (Nodetype.CONST, '')
]
fields: Fielddefs = [
(field[1][1], normalize_fieldtype(name, field[0], names))
for field in items
if field[0] != (Nodetype.CONST, '')
]
res[name] = consts, fields
return res
def visit_msgdef(
self,
children: tuple[str, StringNode, tuple[Optional[T]]],
) -> tuple[str, tuple[T, ...]]:
"""Process single message definition."""
assert len(children) == 3
return normalize_msgtype(children[1][1]), tuple(x for x in children[2] if x is not None)
def visit_msgsep(self, _: str) -> None:
"""Process message separator, suppress output."""
def visit_array_type_spec(
self,
children: tuple[StringNode, tuple[LiteralMatch, tuple[int, ...], LiteralMatch]],
) -> tuple[Nodetype, tuple[StringNode, Optional[int]]]:
"""Process array type specifier."""
if length := children[1][1]:
return Nodetype.ARRAY, (children[0], length[0])
return Nodetype.SEQUENCE, (children[0], None)
def visit_bounded_array_type_spec(
self,
children: list[StringNode],
) -> tuple[Nodetype, tuple[StringNode, None]]:
"""Process bounded array type specifier."""
return Nodetype.SEQUENCE, (children[0], None)
def visit_simple_type_spec(
self,
children: Union[StringNode, tuple[LiteralMatch, LiteralMatch, int]],
) -> StringNode:
"""Process simple type specifier."""
if len(children) > 2:
assert (Rule.LIT, '<=') in children
assert isinstance(children[0], tuple)
typespec = children[0][1]
else:
assert isinstance(children[1], str)
typespec = children[1]
dct = {
'time': 'builtin_interfaces/msg/Time',
'duration': 'builtin_interfaces/msg/Duration',
'byte': 'uint8',
'char': 'uint8',
}
return Nodetype.NAME, dct.get(typespec, typespec)
def visit_scoped_name(
self,
children: Union[StringNode, tuple[StringNode, LiteralMatch, StringNode]],
) -> StringNode:
"""Process scoped name."""
if len(children) == 2:
return children # type: ignore
assert len(children) == 3
return (Nodetype.NAME, '/'.join(x[1] for x in children if x[0] != Rule.LIT)) # type: ignore
def visit_identifier(self, children: str) -> StringNode:
"""Process identifier."""
return (Nodetype.NAME, children)
def visit_boolean_literal(self, children: str) -> bool:
"""Process boolean literal."""
return children.lower() in {'true', '1'}
def visit_float_literal(self, children: str) -> float:
"""Process float literal."""
return float(children)
def visit_decimal_literal(self, children: str) -> int:
"""Process decimal integer literal."""
return int(children)
def visit_octal_literal(self, children: str) -> int:
"""Process octal integer literal."""
return int(children, 8)
def visit_hexadecimal_literal(self, children: str) -> int:
"""Process hexadecimal integer literal."""
return int(children, 16)
def visit_string_literal(self, children: str) -> str:
"""Process integer literal."""
return children[1]
def get_types_from_msg(text: str, name: str) -> Typesdict:
"""Get type from msg message definition.
Args:
text: Message definition.
name: Message typename.
Returns:
Dictionary with a single message name and parse tree.
"""
return parse_message_definition(VisitorMSG(), f'MSG: {name}\n{text}')
def gendefhash(typename: str, subdefs: dict[str, tuple[str, str]]) -> tuple[str, str]:
"""Generate message definition and hash for type.
The subdefs argument will be filled with child definitions.
Args:
typename: Name of type to generate definition for.
subdefs: Child definitions.
Returns:
Message definition and hash.
Raises:
TypesysError: Type does not exist.
"""
# pylint: disable=too-many-branches
typemap = {
'builtin_interfaces/msg/Time': 'time',
'builtin_interfaces/msg/Duration': 'duration',
}
deftext: list[str] = []
hashtext: list[str] = []
if typename not in FIELDDEFS:
raise TypesysError(f'Type {typename!r} is unknown.')
for name, typ, value in FIELDDEFS[typename][0]:
deftext.append(f'{typ} {name}={value}')
hashtext.append(f'{typ} {name}={value}')
for name, (ftype, args) in FIELDDEFS[typename][1]:
if ftype == Nodetype.BASE:
deftext.append(f'{args} {name}')
hashtext.append(f'{args} {name}')
elif ftype == Nodetype.NAME:
assert isinstance(args, str)
subname = args
if subname in typemap:
deftext.append(f'{typemap[subname]} {name}')
hashtext.append(f'{typemap[subname]} {name}')
else:
if subname not in subdefs:
subdefs[subname] = ('', '')
subdefs[subname] = gendefhash(subname, subdefs)
deftext.append(f'{denormalize_msgtype(subname)} {name}')
hashtext.append(f'{subdefs[subname][1]} {name}')
else:
assert isinstance(args, tuple)
subdesc, num = args
count = '' if num is None else str(num)
subtype, subname = subdesc
if subtype == Nodetype.BASE:
deftext.append(f'{subname}[{count}] {name}')
hashtext.append(f'{subname}[{count}] {name}')
elif subname in typemap:
deftext.append(f'{typemap[subname]}[{count}] {name}')
hashtext.append(f'{typemap[subname]}[{count}] {name}')
else:
if subname not in subdefs:
subdefs[subname] = ('', '')
subdefs[subname] = gendefhash(subname, subdefs)
deftext.append(f'{denormalize_msgtype(subname)}[{count}] {name}')
hashtext.append(f'{subdefs[subname][1]} {name}')
if typename == 'std_msgs/msg/Header':
deftext.insert(0, 'uint32 seq')
hashtext.insert(0, 'uint32 seq')
deftext.append('')
return '\n'.join(deftext), md5('\n'.join(hashtext).encode()).hexdigest()
def generate_msgdef(typename: str) -> tuple[str, str]:
"""Generate message definition for type.
Args:
typename: Name of type to generate definition for.
Returns:
Message definition and md5 sum.
"""
subdefs: dict[str, tuple[str, str]] = {}
msgdef, md5sum = gendefhash(typename, subdefs)
msgdef = ''.join(
[
msgdef,
*[f'{"=" * 80}\nMSG: {denormalize_msgtype(k)}\n{v[0]}' for k, v in subdefs.items()],
],
)
return msgdef, md5sum
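gendefhash derives the ROS1-style md5 sum by joining the flattened hash lines with newlines and digesting the result. A standalone sketch of just that final step, using invented field lines (the real hash text is built recursively from FIELDDEFS):

```python
from hashlib import md5

# Invented hash lines for an illustrative three-field message.
hashtext = ['float64 x', 'float64 y', 'float64 z']
digest = md5('\n'.join(hashtext).encode()).hexdigest()

assert len(digest) == 32
assert set(digest) <= set('0123456789abcdef')
```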


@ -0,0 +1,314 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""PEG Parser.
Parsing expression grammar inspired parser for simple EBNF-like notations. It
implements just enough features to support parsing of the different ROS message
definition formats.
"""
from __future__ import annotations
import re
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from typing import Any, Optional, Pattern, TypeVar, Union
Tree = Any
T = TypeVar('T')
class Rule:
"""Rule base class."""
LIT = 'LITERAL'
def __init__(
self,
value: Union[str, Pattern[str], Rule, list[Rule]],
rules: dict[str, Rule],
whitespace: Pattern[str],
name: Optional[str] = None,
):
"""Initialize.
Args:
value: Value of this rule.
rules: Grammar containing all rules.
whitespace: Whitespace pattern.
name: Name of this rule.
"""
self.value = value
self.rules = rules
self.name = name
self.whitespace = whitespace
def skip_ws(self, text: str, pos: int) -> int:
"""Skip whitespace."""
match = self.whitespace.match(text, pos)
return match.span()[1] if match else pos
def make_node(self, data: T) -> Union[T, dict[str, Union[str, T]]]:
"""Make node for parse tree."""
return {'node': self.name, 'data': data} if self.name else data
def parse(self, text: str, pos: int) -> tuple[int, Any]:
"""Apply rule at position."""
raise NotImplementedError # pragma: no cover
class RuleLiteral(Rule):
"""Rule to match string literal."""
def __init__(
self,
value: str,
rules: dict[str, Rule],
whitespace: Pattern[str],
name: Optional[str] = None,
):
"""Initialize.
Args:
value: Value of this rule.
rules: Grammar containing all rules.
whitespace: Whitespace pattern.
name: Name of this rule.
"""
super().__init__(value, rules, whitespace, name)
self.value = value[1:-1].replace('\\\'', '\'')
def parse(self, text: str, pos: int) -> tuple[int, Any]:
"""Apply rule at position."""
value = self.value
assert isinstance(value, str)
if text[pos:pos + len(value)] == value:
npos = pos + len(value)
npos = self.skip_ws(text, npos)
return npos, (self.LIT, value)
return -1, ()
class RuleRegex(Rule):
"""Rule to match regular expression."""
value: Pattern[str]
def __init__(
self,
value: str,
rules: dict[str, Rule],
whitespace: Pattern[str],
name: Optional[str] = None,
):
"""Initialize.
Args:
value: Value of this rule.
rules: Grammar containing all rules.
whitespace: Whitespace pattern.
name: Name of this rule.
"""
super().__init__(value, rules, whitespace, name)
self.value = re.compile(value[2:-1], re.M | re.S)
def parse(self, text: str, pos: int) -> tuple[int, Any]:
"""Apply rule at position."""
match = self.value.match(text, pos)
if not match:
return -1, ()
npos = self.skip_ws(text, match.span()[1])
return npos, self.make_node(match.group())
class RuleToken(Rule):
"""Rule to match token."""
value: str
def parse(self, text: str, pos: int) -> tuple[int, Any]:
"""Apply rule at position."""
token = self.rules[self.value]
npos, data = token.parse(text, pos)
if npos == -1:
return npos, data
return npos, self.make_node(data)
class RuleOneof(Rule):
"""Rule to match first matching subrule."""
value: list[Rule]
def parse(self, text: str, pos: int) -> tuple[int, Any]:
"""Apply rule at position."""
for value in self.value:
npos, data = value.parse(text, pos)
if npos != -1:
return npos, self.make_node(data)
return -1, ()
class RuleSequence(Rule):
"""Rule to match a sequence of subrules."""
value: list[Rule]
def parse(self, text: str, pos: int) -> tuple[int, Any]:
"""Apply rule at position."""
data = []
npos = pos
for value in self.value:
npos, node = value.parse(text, npos)
if npos == -1:
return -1, ()
data.append(node)
return npos, self.make_node(tuple(data))
class RuleZeroPlus(Rule):
"""Rule to match zero or more occurences of subrule."""
value: Rule
def parse(self, text: str, pos: int) -> tuple[int, Any]:
"""Apply rule at position."""
data: list[Any] = []
lpos = pos
while True:
npos, node = self.value.parse(text, lpos)
if npos == -1:
return lpos, self.make_node(tuple(data))
data.append(node)
lpos = npos
class RuleOnePlus(Rule):
"""Rule to match one or more occurences of subrule."""
value: Rule
def parse(self, text: str, pos: int) -> tuple[int, Any]:
"""Apply rule at position."""
npos, node = self.value.parse(text, pos)
if npos == -1:
return -1, ()
data = [node]
lpos = npos
while True:
npos, node = self.value.parse(text, lpos)
if npos == -1:
return lpos, self.make_node(tuple(data))
data.append(node)
lpos = npos
class RuleZeroOne(Rule):
"""Rule to match zero or one occurence of subrule."""
value: Rule
def parse(self, text: str, pos: int) -> tuple[int, Any]:
"""Apply rule at position."""
npos, node = self.value.parse(text, pos)
if npos == -1:
return pos, self.make_node(())
return npos, self.make_node((node,))
class Visitor: # pylint: disable=too-few-public-methods
"""Visitor transforming parse trees."""
RULES: dict[str, Rule] = {}
def __init__(self) -> None:
"""Initialize."""
def visit(self, tree: Tree) -> Tree:
"""Visit all nodes in parse tree."""
if isinstance(tree, tuple):
return tuple(self.visit(x) for x in tree)
if isinstance(tree, str):
return tree
assert isinstance(tree, dict), tree
assert list(tree.keys()) == ['node', 'data'], tree.keys()
tree['data'] = self.visit(tree['data'])
func = getattr(self, f'visit_{tree["node"]}', lambda x: x)
return func(tree['data'])
def split_token(tok: str) -> list[str]:
"""Split repetition and grouping tokens."""
return list(filter(None, re.split(r'(^\()|(\)(?=[*+?]?$))|([*+?]$)', tok)))
def collapse_tokens(
toks: list[Optional[Rule]],
rules: dict[str, Rule],
whitespace: Pattern[str],
) -> Rule:
"""Collapse linear list of tokens to oneof of sequences."""
value: list[Rule] = []
seq: list[Rule] = []
for tok in toks:
if tok:
seq.append(tok)
else:
value.append(RuleSequence(seq, rules, whitespace) if len(seq) > 1 else seq[0])
seq = []
value.append(RuleSequence(seq, rules, whitespace) if len(seq) > 1 else seq[0])
return RuleOneof(value, rules, whitespace) if len(value) > 1 else value[0]
def parse_grammar(
grammar: str,
whitespace: Pattern[str] = re.compile(r'\s+', re.M | re.S),
) -> dict[str, Rule]:
"""Parse grammar into rule dictionary."""
rules: dict[str, Rule] = {}
for token in grammar.split('\n\n'):
lines = token.strip().split('\n')
name, *defs = lines
items = [z for x in defs for y in x.split(' ') if y for z in split_token(y) if z]
assert items
assert items[0] == '='
items.pop(0)
stack: list[Optional[Rule]] = []
parens: list[int] = []
while items:
tok = items.pop(0)
if tok in ['*', '+', '?']:
assert isinstance(stack[-1], Rule)
stack[-1] = {
'*': RuleZeroPlus,
'+': RuleOnePlus,
'?': RuleZeroOne,
}[tok](stack[-1], rules, whitespace)
elif tok == '/':
stack.append(None)
elif tok == '(':
parens.append(len(stack))
elif tok == ')':
index = parens.pop()
rule = collapse_tokens(stack[index:], rules, whitespace)
stack = stack[:index]
stack.append(rule)
elif len(tok) > 2 and tok[:2] == 'r\'':
stack.append(RuleRegex(tok, rules, whitespace))
elif tok[0] == '\'':
stack.append(RuleLiteral(tok, rules, whitespace))
else:
stack.append(RuleToken(tok, rules, whitespace))
res = collapse_tokens(stack, rules, whitespace)
res.name = name
rules[name] = res
return rules
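split_token is the only tokenizer trickery in the grammar loader: it peels leading grouping parentheses and trailing repetition operators off a whitespace-separated token. Its behavior can be checked in isolation (the helper is copied verbatim from above):

```python
import re

def split_token(tok: str) -> list:
    """Split repetition and grouping tokens (verbatim copy of the helper)."""
    return list(filter(None, re.split(r'(^\()|(\)(?=[*+?]?$))|([*+?]$)', tok)))

assert split_token('(foo)*') == ['(', 'foo', ')', '*']
assert split_token('identifier') == ['identifier']
assert split_token('member+') == ['member', '+']
```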


@ -0,0 +1,175 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Code generators and registration functions for the extensible type system."""
from __future__ import annotations
import re
import sys
from importlib.util import module_from_spec, spec_from_loader
from typing import TYPE_CHECKING
from . import types
from .base import Nodetype, TypesysError
if TYPE_CHECKING:
from typing import Any, Optional, Protocol, Union
from .base import Typesdict
class Typestore(Protocol): # pylint: disable=too-few-public-methods
"""Type storage."""
FIELDDEFS: Typesdict
INTLIKE = re.compile('^u?(bool|int|float)')
def get_typehint(desc: tuple[int, Union[str, tuple[tuple[int, str], Optional[int]]]]) -> str:
"""Get python type hint for field.
Args:
desc: Field descriptor.
Returns:
Type hint for field.
"""
if desc[0] == Nodetype.BASE:
assert isinstance(desc[1], str)
return match.group(1) if (match := INTLIKE.match(desc[1])) else 'str'
if desc[0] == Nodetype.NAME:
assert isinstance(desc[1], str)
return desc[1].replace('/', '__')
sub = desc[1][0]
if INTLIKE.match(sub[1]):
typ = 'bool_' if sub[1] == 'bool' else sub[1]
return f'numpy.ndarray[Any, numpy.dtype[numpy.{typ}]]'
assert isinstance(sub, tuple)
return f'list[{get_typehint(sub)}]'
def generate_python_code(typs: Typesdict) -> str:
"""Generate python code from types dictionary.
Args:
typs: Dictionary mapping message typenames to parsetrees.
Returns:
Code for importable python module.
"""
lines = [
'# Copyright 2020-2023 Ternaris.',
'# SPDX-License-Identifier: Apache-2.0',
'#',
'# THIS FILE IS GENERATED, DO NOT EDIT',
'"""ROS2 message types."""',
'',
'# flake8: noqa N801',
'# pylint: disable=invalid-name,too-many-instance-attributes,too-many-lines',
'',
'from __future__ import annotations',
'',
'from dataclasses import dataclass',
'from typing import TYPE_CHECKING',
'',
'if TYPE_CHECKING:',
' from typing import Any, ClassVar',
'',
' import numpy',
'',
' from .base import Typesdict',
'',
'',
]
for name, (consts, fields) in typs.items():
pyname = name.replace('/', '__')
lines += [
'@dataclass',
f'class {pyname}:',
f' """Class for {name}."""',
'',
*[f' {fname}: {get_typehint(desc)}' for fname, desc in fields],
*[
f' {fname}: ClassVar[{get_typehint((1, ftype))}] = {fvalue!r}'
for fname, ftype, fvalue in consts
],
f' __msgtype__: ClassVar[str] = {name!r}',
]
lines += [
'',
'',
]
def get_ftype(ftype: tuple[int, Any]) -> tuple[int, Any]:
if ftype[0] <= 2:
return int(ftype[0]), ftype[1]
return int(ftype[0]), ((int(ftype[1][0][0]), ftype[1][0][1]), ftype[1][1])
lines += ['FIELDDEFS: Typesdict = {']
for name, (consts, fields) in typs.items():
pyname = name.replace('/', '__')
lines += [
f' \'{name}\': (',
*(
[
' [',
*[
f' ({fname!r}, {ftype!r}, {fvalue!r}),'
for fname, ftype, fvalue in consts
],
' ],',
] if consts else [' [],']
),
' [',
*[f' ({fname!r}, {get_ftype(ftype)!r}),' for fname, ftype in fields],
' ],',
' ),',
]
lines += [
'}',
'',
]
return '\n'.join(lines)
def register_types(typs: Typesdict, typestore: Typestore = types) -> None:
"""Register types in type system.
Args:
typs: Dictionary mapping message typenames to parsetrees.
typestore: Type store.
Raises:
TypesysError: Type already present with different definition.
"""
code = generate_python_code(typs)
name = 'rosbags.usertypes'
spec = spec_from_loader(name, loader=None)
assert spec
module = module_from_spec(spec)
sys.modules[name] = module
exec(code, module.__dict__) # pylint: disable=exec-used
fielddefs: Typesdict = module.FIELDDEFS
for name, (_, fields) in fielddefs.items():
if name == 'std_msgs/msg/Header':
continue
if have := typestore.FIELDDEFS.get(name):
_, have_fields = have
have_fields = [(x[0].lower(), x[1]) for x in have_fields]
fields = [(x[0].lower(), x[1]) for x in fields]
if have_fields != fields:
raise TypesysError(f'Type {name!r} is already present with different definition.')
for name in fielddefs.keys() - typestore.FIELDDEFS.keys():
pyname = name.replace('/', '__')
setattr(typestore, pyname, getattr(module, pyname))
typestore.FIELDDEFS[name] = fielddefs[name]
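The importlib machinery used in `register_types`, building a module object with no file behind it and exec-ing generated source into its namespace, can be sketched standalone (the module name below is made up for illustration):

```python
import sys
from importlib.util import module_from_spec, spec_from_loader

# Create an importable module that has no backing file, then populate it
# by executing (here trivial) generated source code in its namespace.
spec = spec_from_loader('demo_generated', loader=None)
assert spec is not None
module = module_from_spec(spec)
sys.modules['demo_generated'] = module
exec('ANSWER = 42', module.__dict__)

import demo_generated
assert demo_generated.ANSWER == 42
```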

File diff suppressed because it is too large

# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Rosbag tests."""

rosbags/tests/cdr.py
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Reference CDR message serializer and deserializer."""
from __future__ import annotations
import sys
from struct import Struct, pack_into, unpack_from
from typing import TYPE_CHECKING, Dict, List, Union, cast
import numpy
from numpy.typing import NDArray
from rosbags.serde.messages import SerdeError, get_msgdef
from rosbags.serde.typing import Msgdef
from rosbags.serde.utils import SIZEMAP, Valtype
from rosbags.typesys import types
if TYPE_CHECKING:
from typing import Any, Tuple
from rosbags.serde.typing import Descriptor
Array = Union[List[Msgdef], List[str], numpy.ndarray]
BasetypeMap = Dict[str, Struct]
BASETYPEMAP_LE: BasetypeMap = {
'bool': Struct('?'),
'int8': Struct('b'),
'int16': Struct('<h'),
'int32': Struct('<i'),
'int64': Struct('<q'),
'uint8': Struct('B'),
'uint16': Struct('<H'),
'uint32': Struct('<I'),
'uint64': Struct('<Q'),
'float32': Struct('<f'),
'float64': Struct('<d'),
}
BASETYPEMAP_BE: BasetypeMap = {
'bool': Struct('?'),
'int8': Struct('b'),
'int16': Struct('>h'),
'int32': Struct('>i'),
'int64': Struct('>q'),
'uint8': Struct('B'),
'uint16': Struct('>H'),
'uint32': Struct('>I'),
'uint64': Struct('>Q'),
'float32': Struct('>f'),
'float64': Struct('>d'),
}
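The only difference between the two maps is the byte-order prefix in the `Struct` format strings; a quick sketch of the effect:

```python
from struct import Struct

le, be = Struct('<I'), Struct('>I')
# The same value serializes to reversed byte sequences.
assert le.pack(1) == b'\x01\x00\x00\x00'
assert be.pack(1) == b'\x00\x00\x00\x01'
# Single-byte types need no prefix, which is why 'bool'/'int8'/'uint8'
# use the same Struct in both maps.
assert Struct('B').pack(255) == b'\xff'
```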
def deserialize_number(rawdata: bytes, bmap: BasetypeMap, pos: int, basetype: str) \
-> Tuple[Union[bool, float, int], int]:
"""Deserialize a single boolean, float, or int.
Args:
rawdata: Serialized data.
bmap: Basetype metadata.
pos: Read position.
basetype: Number type string.
Returns:
Deserialized number and new read position.
"""
dtype, size = bmap[basetype], SIZEMAP[basetype]
pos = (pos + size - 1) & -size
return dtype.unpack_from(rawdata, pos)[0], pos + size
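The expression `(pos + size - 1) & -size` rounds the read position up to the next multiple of `size`, implementing CDR's natural-alignment rule; a minimal sketch:

```python
def align(pos: int, size: int) -> int:
    # Round pos up to the next multiple of size; size must be a power of two.
    return (pos + size - 1) & -size

assert align(0, 4) == 0
assert align(5, 4) == 8
assert align(8, 8) == 8
assert align(9, 8) == 16
```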
def deserialize_string(rawdata: bytes, bmap: BasetypeMap, pos: int) \
-> Tuple[str, int]:
"""Deserialize a string value.
Args:
rawdata: Serialized data.
bmap: Basetype metadata.
pos: Read position.
Returns:
Deserialized string and new read position.
"""
pos = (pos + 4 - 1) & -4
length = bmap['int32'].unpack_from(rawdata, pos)[0]
val = bytes(rawdata[pos + 4:pos + 4 + length - 1])
return val.decode(), pos + 4 + length
def deserialize_array(rawdata: bytes, bmap: BasetypeMap, pos: int, num: int, desc: Descriptor) \
-> Tuple[Array, int]:
"""Deserialize an array of items of same type.
Args:
rawdata: Serialized data.
bmap: Basetype metadata.
pos: Read position.
num: Number of elements.
desc: Element type descriptor.
Returns:
Deserialized array and new read position.
Raises:
SerdeError: Unexpected element type.
"""
if desc.valtype == Valtype.BASE:
if desc.args == 'string':
strs = []
while (num := num - 1) >= 0:
val, pos = deserialize_string(rawdata, bmap, pos)
strs.append(val)
return strs, pos
size = SIZEMAP[desc.args]
pos = (pos + size - 1) & -size
ndarr = numpy.frombuffer(rawdata, dtype=desc.args, count=num, offset=pos)
if (bmap is BASETYPEMAP_LE) != (sys.byteorder == 'little'):
ndarr = ndarr.byteswap() # no inplace on readonly array
return ndarr, pos + num * SIZEMAP[desc.args]
if desc.valtype == Valtype.MESSAGE:
msgs = []
while (num := num - 1) >= 0:
msg, pos = deserialize_message(rawdata, bmap, pos, desc.args)
msgs.append(msg)
return msgs, pos
raise SerdeError(f'Nested arrays {desc!r} are not supported.')
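`numpy.frombuffer` returns a read-only view over the raw bytes, so when the wire endianness differs from the host the code must use the copy that `byteswap()` produces rather than swapping in place; sketched with an explicit little-endian dtype:

```python
import numpy as np

# 1 encoded as a little-endian uint32.
le = np.frombuffer(b'\x01\x00\x00\x00', dtype='<u4')
assert int(le[0]) == 1
assert not le.flags.writeable          # buffer-backed arrays are read-only

swapped = le.byteswap()                # returns a copy, original untouched
assert int(swapped[0]) == 0x01000000   # bytes reversed: 16777216
```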
def deserialize_message(rawdata: bytes, bmap: BasetypeMap, pos: int, msgdef: Msgdef) \
-> Tuple[Msgdef, int]:
"""Deserialize a message.
Args:
rawdata: Serialized data.
bmap: Basetype metadata.
pos: Read position.
msgdef: Message definition.
Returns:
Deserialized message and new read position.
"""
values: List[Any] = []
for _, desc in msgdef.fields:
if desc.valtype == Valtype.MESSAGE:
obj, pos = deserialize_message(rawdata, bmap, pos, desc.args)
values.append(obj)
elif desc.valtype == Valtype.BASE:
if desc.args == 'string':
val, pos = deserialize_string(rawdata, bmap, pos)
values.append(val)
else:
num, pos = deserialize_number(rawdata, bmap, pos, desc.args)
values.append(num)
elif desc.valtype == Valtype.ARRAY:
subdesc, length = desc.args
arr, pos = deserialize_array(rawdata, bmap, pos, length, subdesc)
values.append(arr)
elif desc.valtype == Valtype.SEQUENCE:
size, pos = deserialize_number(rawdata, bmap, pos, 'int32')
arr, pos = deserialize_array(rawdata, bmap, pos, int(size), desc.args[0])
values.append(arr)
return msgdef.cls(*values), pos
def deserialize(rawdata: bytes, typename: str) -> Msgdef:
"""Deserialize raw data into a message object.
Args:
rawdata: Serialized data.
typename: Type to deserialize.
Returns:
Deserialized message object.
"""
_, little_endian = unpack_from('BB', rawdata, 0)
msgdef = get_msgdef(typename, types)
obj, _ = deserialize_message(
rawdata[4:],
BASETYPEMAP_LE if little_endian else BASETYPEMAP_BE,
0,
msgdef,
)
return obj
def serialize_number(
rawdata: memoryview,
bmap: BasetypeMap,
pos: int,
basetype: str,
val: Union[bool, float, int],
) -> int:
"""Serialize a single boolean, float, or int.
Args:
rawdata: Serialized data.
bmap: Basetype metadata.
pos: Write position.
basetype: Number type string.
val: Value to serialize.
Returns:
Next write position.
"""
dtype, size = bmap[basetype], SIZEMAP[basetype]
pos = (pos + size - 1) & -size
dtype.pack_into(rawdata, pos, val)
return pos + size
def serialize_string(rawdata: memoryview, bmap: BasetypeMap, pos: int, val: str) \
-> int:
"""Serialize a string value.
Args:
rawdata: Serialized data.
bmap: Basetype metadata.
pos: Write position.
val: Value to serialize.
Returns:
Next write position.
"""
bval = memoryview(val.encode())
length = len(bval) + 1
pos = (pos + 4 - 1) & -4
bmap['int32'].pack_into(rawdata, pos, length)
rawdata[pos + 4:pos + 4 + length - 1] = bval
return pos + 4 + length
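On the wire a CDR string is a 4-byte length (which includes the NUL terminator), the encoded bytes, and the terminator itself; a hypothetical little-endian-only sketch that ignores the alignment step:

```python
import struct

def cdr_string_le(val: str) -> bytes:
    # The 4-byte little-endian length counts the trailing NUL terminator.
    data = val.encode()
    return struct.pack('<I', len(data) + 1) + data + b'\x00'

assert cdr_string_le('ab') == b'\x03\x00\x00\x00ab\x00'
assert cdr_string_le('') == b'\x01\x00\x00\x00\x00'
```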
def serialize_array(
rawdata: memoryview,
bmap: BasetypeMap,
pos: int,
desc: Descriptor,
val: Array,
) -> int:
"""Serialize an array of items of same type.
Args:
rawdata: Serialized data.
bmap: Basetype metadata.
pos: Write position.
desc: Element type descriptor.
val: Value to serialize.
Returns:
Next write position.
Raises:
SerdeError: Unexpected element type.
"""
if desc.valtype == Valtype.BASE:
if desc.args == 'string':
for item in val:
pos = serialize_string(rawdata, bmap, pos, cast('str', item))
return pos
size = SIZEMAP[desc.args]
pos = (pos + size - 1) & -size
size *= len(val)
val = cast('NDArray[numpy.int_]', val)
if (bmap is BASETYPEMAP_LE) != (sys.byteorder == 'little'):
val = val.byteswap() # no inplace on readonly array
rawdata[pos:pos + size] = memoryview(val.tobytes())
return pos + size
if desc.valtype == Valtype.MESSAGE:
for item in val:
pos = serialize_message(rawdata, bmap, pos, item, desc.args)
return pos
raise SerdeError(f'Nested arrays {desc!r} are not supported.') # pragma: no cover
def serialize_message(
rawdata: memoryview,
bmap: BasetypeMap,
pos: int,
message: object,
msgdef: Msgdef,
) -> int:
"""Serialize a message.
Args:
rawdata: Serialized data.
bmap: Basetype metadata.
pos: Write position.
message: Message object.
msgdef: Message definition.
Returns:
Next write position.
"""
for fieldname, desc in msgdef.fields:
val = getattr(message, fieldname)
if desc.valtype == Valtype.MESSAGE:
pos = serialize_message(rawdata, bmap, pos, val, desc.args)
elif desc.valtype == Valtype.BASE:
if desc.args == 'string':
pos = serialize_string(rawdata, bmap, pos, val)
else:
pos = serialize_number(rawdata, bmap, pos, desc.args, val)
elif desc.valtype == Valtype.ARRAY:
pos = serialize_array(rawdata, bmap, pos, desc.args[0], val)
elif desc.valtype == Valtype.SEQUENCE:
size = len(val)
pos = serialize_number(rawdata, bmap, pos, 'int32', size)
pos = serialize_array(rawdata, bmap, pos, desc.args[0], val)
return pos
def get_array_size(desc: Descriptor, val: Array, size: int) -> int:
"""Calculate size of an array.
Args:
desc: Element type descriptor.
val: Array to calculate size of.
size: Current size of message.
Returns:
Size of val in bytes.
Raises:
SerdeError: Unexpected element type.
"""
if desc.valtype == Valtype.BASE:
if desc.args == 'string':
for item in val:
size = (size + 4 - 1) & -4
size += 4 + len(item) + 1
return size
isize = SIZEMAP[desc.args]
size = (size + isize - 1) & -isize
return size + isize * len(val)
if desc.valtype == Valtype.MESSAGE:
for item in val:
size = get_size(item, desc.args, size)
return size
raise SerdeError(f'Nested arrays {desc!r} are not supported.') # pragma: no cover
def get_size(message: object, msgdef: Msgdef, size: int = 0) -> int:
"""Calculate size of serialized message.
Args:
message: Message object.
msgdef: Message definition.
size: Current size of message.
Returns:
Size of message in bytes.
Raises:
SerdeError: Unexpected array length in message.
"""
for fieldname, desc in msgdef.fields:
val = getattr(message, fieldname)
if desc.valtype == Valtype.MESSAGE:
size = get_size(val, desc.args, size)
elif desc.valtype == Valtype.BASE:
if desc.args == 'string':
size = (size + 4 - 1) & -4
size += 4 + len(val.encode()) + 1
else:
isize = SIZEMAP[desc.args]
size = (size + isize - 1) & -isize
size += isize
elif desc.valtype == Valtype.ARRAY:
subdesc, length = desc.args
if len(val) != length:
raise SerdeError(f'Unexpected array length: {len(val)} != {length}.')
size = get_array_size(subdesc, val, size)
elif desc.valtype == Valtype.SEQUENCE:
size = (size + 4 - 1) & -4
size += 4
size = get_array_size(desc.args[0], val, size)
return size
def serialize(
message: object,
typename: str,
little_endian: bool = sys.byteorder == 'little',
) -> memoryview:
"""Serialize message object to bytes.
Args:
message: Message object.
typename: Type to serialize.
little_endian: Whether to use little-endian byte order.
Returns:
Serialized bytes.
"""
msgdef = get_msgdef(typename, types)
size = 4 + get_size(message, msgdef)
rawdata = memoryview(bytearray(size))
pack_into('BB', rawdata, 0, 0, little_endian)
pos = serialize_message(
rawdata[4:],
BASETYPEMAP_LE if little_endian else BASETYPEMAP_BE,
0,
message,
msgdef,
)
assert pos + 4 == size
return rawdata.toreadonly()
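The `pack_into('BB', rawdata, 0, 0, little_endian)` call above writes the first two bytes of the 4-byte CDR encapsulation header, a representation identifier and an endianness flag, which is why readers skip `rawdata[4:]`. A minimal sketch of the full 4-byte prefix (matching the `HEADER` constant used in the reader tests below):

```python
import struct

def encapsulation_header(little_endian: bool) -> bytes:
    # Representation identifier (0), endianness flag, two padding bytes.
    return struct.pack('BBBB', 0, int(little_endian), 0, 0)

assert encapsulation_header(True) == b'\x00\x01\x00\x00'
assert encapsulation_header(False) == b'\x00\x00\x00\x00'
```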

# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Rosbag1to2 converter tests."""
from __future__ import annotations
import sys
from pathlib import Path
from typing import TYPE_CHECKING
from unittest.mock import call, patch
import pytest
from rosbags.convert import ConverterError, convert
from rosbags.convert.__main__ import main
from rosbags.convert.converter import LATCH
from rosbags.interfaces import Connection, ConnectionExtRosbag1, ConnectionExtRosbag2
from rosbags.rosbag1 import ReaderError
from rosbags.rosbag2 import WriterError
if TYPE_CHECKING:
from typing import Any
def test_cliwrapper(tmp_path: Path) -> None:
"""Test cli wrapper."""
(tmp_path / 'subdir').mkdir()
(tmp_path / 'ros1.bag').write_text('')
with patch('rosbags.convert.__main__.convert') as cvrt, \
patch.object(sys, 'argv', ['cvt']), \
pytest.raises(SystemExit):
main()
assert not cvrt.called
with patch('rosbags.convert.__main__.convert') as cvrt, \
patch.object(sys, 'argv', ['cvt', str(tmp_path / 'no.bag')]), \
pytest.raises(SystemExit):
main()
assert not cvrt.called
with patch('rosbags.convert.__main__.convert') as cvrt, \
patch.object(sys, 'argv', ['cvt', str(tmp_path / 'ros1.bag')]):
main()
cvrt.assert_called_with(
src=tmp_path / 'ros1.bag',
dst=None,
exclude_topics=[],
include_topics=[],
)
with patch('rosbags.convert.__main__.convert') as cvrt, \
patch.object(sys, 'argv', ['cvt',
str(tmp_path / 'ros1.bag'),
'--dst',
str(tmp_path / 'subdir')]), \
pytest.raises(SystemExit):
main()
assert not cvrt.called
with patch('rosbags.convert.__main__.convert') as cvrt, \
patch.object(sys, 'argv', ['cvt',
str(tmp_path / 'ros1.bag'),
'--dst',
str(tmp_path / 'ros2.bag')]), \
pytest.raises(SystemExit):
main()
assert not cvrt.called
with patch('rosbags.convert.__main__.convert') as cvrt, \
patch.object(sys, 'argv', ['cvt',
str(tmp_path / 'ros1.bag'),
'--dst',
str(tmp_path / 'target')]):
main()
cvrt.assert_called_with(
src=tmp_path / 'ros1.bag',
dst=tmp_path / 'target',
exclude_topics=[],
include_topics=[],
)
with patch.object(sys, 'argv', ['cvt', str(tmp_path / 'ros1.bag')]), \
patch('builtins.print') as mock_print, \
patch('rosbags.convert.__main__.convert', side_effect=ConverterError('exc')), \
pytest.raises(SystemExit):
main()
mock_print.assert_called_with('ERROR: exc')
with patch('rosbags.convert.__main__.convert') as cvrt, \
patch.object(sys, 'argv', ['cvt', str(tmp_path / 'subdir')]):
main()
cvrt.assert_called_with(
src=tmp_path / 'subdir',
dst=None,
exclude_topics=[],
include_topics=[],
)
with patch('rosbags.convert.__main__.convert') as cvrt, \
patch.object(sys, 'argv', ['cvt',
str(tmp_path / 'subdir'),
'--dst',
str(tmp_path / 'ros1.bag')]), \
pytest.raises(SystemExit):
main()
assert not cvrt.called
with patch('rosbags.convert.__main__.convert') as cvrt, \
patch.object(sys, 'argv', ['cvt',
str(tmp_path / 'subdir'),
'--dst',
str(tmp_path / 'target.bag')]):
main()
cvrt.assert_called_with(
src=tmp_path / 'subdir',
dst=tmp_path / 'target.bag',
exclude_topics=[],
include_topics=[],
)
with patch.object(sys, 'argv', ['cvt', str(tmp_path / 'subdir')]), \
patch('builtins.print') as mock_print, \
patch('rosbags.convert.__main__.convert', side_effect=ConverterError('exc')), \
pytest.raises(SystemExit):
main()
mock_print.assert_called_with('ERROR: exc')
with patch('rosbags.convert.__main__.convert') as cvrt, \
patch.object(sys, 'argv', ['cvt',
str(tmp_path / 'ros1.bag'),
'--exclude-topic',
'/foo']):
main()
cvrt.assert_called_with(
src=tmp_path / 'ros1.bag',
dst=None,
exclude_topics=['/foo'],
include_topics=[],
)
def test_convert_1to2(tmp_path: Path) -> None:
"""Test conversion from rosbag1 to rosbag2."""
(tmp_path / 'subdir').mkdir()
(tmp_path / 'foo.bag').write_text('')
with pytest.raises(ConverterError, match='exists already'):
convert(Path('foo.bag'), tmp_path / 'subdir')
with patch('rosbags.convert.converter.Reader1') as reader, \
patch('rosbags.convert.converter.Writer2') as writer, \
patch('rosbags.convert.converter.get_types_from_msg', return_value={'typ': 'def'}), \
patch('rosbags.convert.converter.register_types') as register_types, \
patch('rosbags.convert.converter.ros1_to_cdr') as ros1_to_cdr:
readerinst = reader.return_value.__enter__.return_value
writerinst = writer.return_value.__enter__.return_value
connections = [
Connection(1, '/topic', 'typ', 'def', '', -1, ConnectionExtRosbag1(None, False), None),
Connection(2, '/topic', 'typ', 'def', '', -1, ConnectionExtRosbag1(None, True), None),
Connection(3, '/other', 'typ', 'def', '', -1, ConnectionExtRosbag1(None, False), None),
Connection(
4,
'/other',
'typ',
'def',
'',
-1,
ConnectionExtRosbag1('caller', False),
None,
),
]
wconnections = [
Connection(1, '/topic', 'typ', '', '', -1, ConnectionExtRosbag2('cdr', ''), None),
Connection(2, '/topic', 'typ', '', '', -1, ConnectionExtRosbag2('cdr', LATCH), None),
Connection(3, '/other', 'typ', '', '', -1, ConnectionExtRosbag2('cdr', ''), None),
]
readerinst.connections = [
connections[0],
connections[1],
connections[2],
connections[3],
]
readerinst.messages.return_value = [
(connections[0], 42, b'\x42'),
(connections[1], 43, b'\x43'),
(connections[2], 44, b'\x44'),
(connections[3], 45, b'\x45'),
]
writerinst.connections = []
def add_connection(*_: Any) -> Connection: # noqa: ANN401
"""Mock for Writer.add_connection."""
writerinst.connections = [
conn for _, conn in zip(range(len(writerinst.connections) + 1), wconnections)
]
return wconnections[len(writerinst.connections) - 1]
writerinst.add_connection.side_effect = add_connection
ros1_to_cdr.return_value = b'666'
convert(Path('foo.bag'), None)
reader.assert_called_with(Path('foo.bag'))
readerinst.messages.assert_called_with(connections=readerinst.connections)
writer.assert_called_with(Path('foo'))
writerinst.add_connection.assert_has_calls(
[
call('/topic', 'typ', 'cdr', ''),
call('/topic', 'typ', 'cdr', LATCH),
call('/other', 'typ', 'cdr', ''),
],
)
writerinst.write.assert_has_calls(
[
call(wconnections[0], 42, b'666'),
call(wconnections[1], 43, b'666'),
call(wconnections[2], 44, b'666'),
call(wconnections[2], 45, b'666'),
],
)
register_types.assert_called_with({'typ': 'def'})
ros1_to_cdr.assert_has_calls(
[
call(b'\x42', 'typ'),
call(b'\x43', 'typ'),
call(b'\x44', 'typ'),
call(b'\x45', 'typ'),
],
)
with pytest.raises(ConverterError, match='No connections left for conversion'):
convert(Path('foo.bag'), None, ['/topic', '/other'])
writerinst.connections.clear()
ros1_to_cdr.side_effect = KeyError('exc')
with pytest.raises(ConverterError, match='Converting rosbag: .*exc'):
convert(Path('foo.bag'), None)
writer.side_effect = WriterError('exc')
with pytest.raises(ConverterError, match='Writing destination bag: exc'):
convert(Path('foo.bag'), None)
reader.side_effect = ReaderError('exc')
with pytest.raises(ConverterError, match='Reading source bag: exc'):
convert(Path('foo.bag'), None)
def test_convert_2to1(tmp_path: Path) -> None:
"""Test conversion from rosbag2 to rosbag1."""
(tmp_path / 'subdir').mkdir()
(tmp_path / 'foo.bag').write_text('')
with pytest.raises(ConverterError, match='exists already'):
convert(Path('subdir'), tmp_path / 'foo.bag')
with patch('rosbags.convert.converter.Reader2') as reader, \
patch('rosbags.convert.converter.Writer1') as writer, \
patch('rosbags.convert.converter.cdr_to_ros1') as cdr_to_ros1:
readerinst = reader.return_value.__enter__.return_value
writerinst = writer.return_value.__enter__.return_value
connections = [
Connection(
1,
'/topic',
'std_msgs/msg/Bool',
'',
'',
-1,
ConnectionExtRosbag2('', ''),
None,
),
Connection(
2,
'/topic',
'std_msgs/msg/Bool',
'',
'',
-1,
ConnectionExtRosbag2('', LATCH),
None,
),
Connection(
3,
'/other',
'std_msgs/msg/Bool',
'',
'',
-1,
ConnectionExtRosbag2('', ''),
None,
),
Connection(
4,
'/other',
'std_msgs/msg/Bool',
'',
'',
-1,
ConnectionExtRosbag2('', '0'),
None,
),
]
wconnections = [
Connection(
1,
'/topic',
'std_msgs/msg/Bool',
'',
'8b94c1b53db61fb6aed406028ad6332a',
-1,
ConnectionExtRosbag1(None, False),
None,
),
Connection(
2,
'/topic',
'std_msgs/msg/Bool',
'',
'8b94c1b53db61fb6aed406028ad6332a',
-1,
ConnectionExtRosbag1(None, True),
None,
),
Connection(
3,
'/other',
'std_msgs/msg/Bool',
'',
'8b94c1b53db61fb6aed406028ad6332a',
-1,
ConnectionExtRosbag1(None, False),
None,
),
]
readerinst.connections = [
connections[0],
connections[1],
connections[2],
connections[3],
]
readerinst.messages.return_value = [
(connections[0], 42, b'\x42'),
(connections[1], 43, b'\x43'),
(connections[2], 44, b'\x44'),
(connections[3], 45, b'\x45'),
]
writerinst.connections = []
def add_connection(*_: Any) -> Connection: # noqa: ANN401
"""Mock for Writer.add_connection."""
writerinst.connections = [
conn for _, conn in zip(range(len(writerinst.connections) + 1), wconnections)
]
return wconnections[len(writerinst.connections) - 1]
writerinst.add_connection.side_effect = add_connection
cdr_to_ros1.return_value = b'666'
convert(Path('foo'), None)
reader.assert_called_with(Path('foo'))
reader.return_value.__enter__.return_value.messages.assert_called_with(
connections=readerinst.connections,
)
writer.assert_called_with(Path('foo.bag'))
writer.return_value.__enter__.return_value.add_connection.assert_has_calls(
[
call(
'/topic',
'std_msgs/msg/Bool',
'bool data\n',
'8b94c1b53db61fb6aed406028ad6332a',
None,
0,
),
call(
'/topic',
'std_msgs/msg/Bool',
'bool data\n',
'8b94c1b53db61fb6aed406028ad6332a',
None,
1,
),
call(
'/other',
'std_msgs/msg/Bool',
'bool data\n',
'8b94c1b53db61fb6aed406028ad6332a',
None,
0,
),
],
)
writer.return_value.__enter__.return_value.write.assert_has_calls(
[
call(wconnections[0], 42, b'666'),
call(wconnections[1], 43, b'666'),
call(wconnections[2], 44, b'666'),
call(wconnections[2], 45, b'666'),
],
)
cdr_to_ros1.assert_has_calls(
[
call(b'\x42', 'std_msgs/msg/Bool'),
call(b'\x43', 'std_msgs/msg/Bool'),
call(b'\x44', 'std_msgs/msg/Bool'),
call(b'\x45', 'std_msgs/msg/Bool'),
],
)
with pytest.raises(ConverterError, match='No connections left for conversion'):
convert(Path('foobag'), None, ['/topic', '/other'])
writerinst.connections.clear()
cdr_to_ros1.side_effect = KeyError('exc')
with pytest.raises(ConverterError, match='Converting rosbag: .*exc'):
convert(Path('foo'), None)
writer.side_effect = WriterError('exc')
with pytest.raises(ConverterError, match='Writing destination bag: exc'):
convert(Path('foo'), None)
reader.side_effect = ReaderError('exc')
with pytest.raises(ConverterError, match='Reading source bag: exc'):
convert(Path('foo'), None)

# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Reader tests."""
from __future__ import annotations
from typing import TYPE_CHECKING
from unittest.mock import patch
import pytest
from rosbags.highlevel import AnyReader, AnyReaderError
from rosbags.interfaces import Connection
from rosbags.rosbag1 import Writer as Writer1
from rosbags.rosbag2 import Writer as Writer2
if TYPE_CHECKING:
from pathlib import Path
from typing import Sequence
HEADER = b'\x00\x01\x00\x00'
@pytest.fixture()
def bags1(tmp_path: Path) -> list[Path]:
"""Test data fixture."""
paths = [
tmp_path / 'ros1_1.bag',
tmp_path / 'ros1_2.bag',
tmp_path / 'ros1_3.bag',
tmp_path / 'bad.bag',
]
with (Writer1(paths[0])) as writer:
topic1 = writer.add_connection('/topic1', 'std_msgs/msg/Int8')
topic2 = writer.add_connection('/topic2', 'std_msgs/msg/Int16')
writer.write(topic1, 1, b'\x01')
writer.write(topic2, 2, b'\x02\x00')
writer.write(topic1, 9, b'\x09')
with (Writer1(paths[1])) as writer:
topic1 = writer.add_connection('/topic1', 'std_msgs/msg/Int8')
writer.write(topic1, 5, b'\x05')
with (Writer1(paths[2])) as writer:
topic2 = writer.add_connection('/topic2', 'std_msgs/msg/Int16')
writer.write(topic2, 15, b'\x15\x00')
paths[3].touch()
return paths
@pytest.fixture()
def bags2(tmp_path: Path) -> list[Path]:
"""Test data fixture."""
paths = [
tmp_path / 'ros2_1',
tmp_path / 'bad',
]
with (Writer2(paths[0])) as writer:
topic1 = writer.add_connection('/topic1', 'std_msgs/msg/Int8')
topic2 = writer.add_connection('/topic2', 'std_msgs/msg/Int16')
writer.write(topic1, 1, HEADER + b'\x01')
writer.write(topic2, 2, HEADER + b'\x02\x00')
writer.write(topic1, 9, HEADER + b'\x09')
writer.write(topic1, 5, HEADER + b'\x05')
writer.write(topic2, 15, HEADER + b'\x15\x00')
paths[1].mkdir()
(paths[1] / 'metadata.yaml').write_text(':')
return paths
def test_anyreader1(bags1: Sequence[Path]) -> None: # pylint: disable=redefined-outer-name
"""Test AnyReader on rosbag1."""
# pylint: disable=too-many-statements
with pytest.raises(AnyReaderError, match='at least one'):
AnyReader([])
with pytest.raises(AnyReaderError, match='missing'):
AnyReader([bags1[0] / 'badname'])
reader = AnyReader(bags1)
with pytest.raises(AssertionError):
assert reader.topics
with pytest.raises(AssertionError):
next(reader.messages())
reader = AnyReader(bags1)
with pytest.raises(AnyReaderError, match='seems to be empty'):
reader.open()
assert all(not x.bio for x in reader.readers) # type: ignore[union-attr]
with AnyReader(bags1[:3]) as reader:
assert reader.duration == 15
assert reader.start_time == 1
assert reader.end_time == 16
assert reader.message_count == 5
assert list(reader.topics.keys()) == ['/topic1', '/topic2']
assert len(reader.topics['/topic1'].connections) == 2
assert reader.topics['/topic1'].msgcount == 3
assert len(reader.topics['/topic2'].connections) == 2
assert reader.topics['/topic2'].msgcount == 2
gen = reader.messages()
nxt = next(gen)
assert nxt[0].topic == '/topic1'
assert nxt[1:] == (1, b'\x01')
msg = reader.deserialize(nxt[2], nxt[0].msgtype)
assert msg.data == 1 # type: ignore
nxt = next(gen)
assert nxt[0].topic == '/topic2'
assert nxt[1:] == (2, b'\x02\x00')
msg = reader.deserialize(nxt[2], nxt[0].msgtype)
assert msg.data == 2 # type: ignore
nxt = next(gen)
assert nxt[0].topic == '/topic1'
assert nxt[1:] == (5, b'\x05')
msg = reader.deserialize(nxt[2], nxt[0].msgtype)
assert msg.data == 5 # type: ignore
nxt = next(gen)
assert nxt[0].topic == '/topic1'
assert nxt[1:] == (9, b'\x09')
msg = reader.deserialize(nxt[2], nxt[0].msgtype)
assert msg.data == 9 # type: ignore
nxt = next(gen)
assert nxt[0].topic == '/topic2'
assert nxt[1:] == (15, b'\x15\x00')
msg = reader.deserialize(nxt[2], nxt[0].msgtype)
assert msg.data == 21 # type: ignore
with pytest.raises(StopIteration):
next(gen)
gen = reader.messages(connections=reader.topics['/topic1'].connections)
nxt = next(gen)
assert nxt[0].topic == '/topic1'
nxt = next(gen)
assert nxt[0].topic == '/topic1'
nxt = next(gen)
assert nxt[0].topic == '/topic1'
with pytest.raises(StopIteration):
next(gen)
def test_anyreader2(bags2: list[Path]) -> None: # pylint: disable=redefined-outer-name
"""Test AnyReader on rosbag2."""
# pylint: disable=too-many-statements
with pytest.raises(AnyReaderError, match='multiple rosbag2'):
AnyReader(bags2)
with pytest.raises(AnyReaderError, match='YAML'):
AnyReader([bags2[1]])
with AnyReader([bags2[0]]) as reader:
assert reader.duration == 15
assert reader.start_time == 1
assert reader.end_time == 16
assert reader.message_count == 5
assert list(reader.topics.keys()) == ['/topic1', '/topic2']
assert len(reader.topics['/topic1'].connections) == 1
assert reader.topics['/topic1'].msgcount == 3
assert len(reader.topics['/topic2'].connections) == 1
assert reader.topics['/topic2'].msgcount == 2
gen = reader.messages()
nxt = next(gen)
assert nxt[0].topic == '/topic1'
assert nxt[1:] == (1, HEADER + b'\x01')
msg = reader.deserialize(nxt[2], nxt[0].msgtype)
assert msg.data == 1 # type: ignore
nxt = next(gen)
assert nxt[0].topic == '/topic2'
assert nxt[1:] == (2, HEADER + b'\x02\x00')
msg = reader.deserialize(nxt[2], nxt[0].msgtype)
assert msg.data == 2 # type: ignore
nxt = next(gen)
assert nxt[0].topic == '/topic1'
assert nxt[1:] == (5, HEADER + b'\x05')
msg = reader.deserialize(nxt[2], nxt[0].msgtype)
assert msg.data == 5 # type: ignore
nxt = next(gen)
assert nxt[0].topic == '/topic1'
assert nxt[1:] == (9, HEADER + b'\x09')
msg = reader.deserialize(nxt[2], nxt[0].msgtype)
assert msg.data == 9 # type: ignore
nxt = next(gen)
assert nxt[0].topic == '/topic2'
assert nxt[1:] == (15, HEADER + b'\x15\x00')
msg = reader.deserialize(nxt[2], nxt[0].msgtype)
assert msg.data == 21 # type: ignore
with pytest.raises(StopIteration):
next(gen)
gen = reader.messages(connections=reader.topics['/topic1'].connections)
nxt = next(gen)
assert nxt[0].topic == '/topic1'
nxt = next(gen)
assert nxt[0].topic == '/topic1'
nxt = next(gen)
assert nxt[0].topic == '/topic1'
with pytest.raises(StopIteration):
next(gen)
def test_anyreader2_autoregister(bags2: list[Path]) -> None: # pylint: disable=redefined-outer-name
"""Test AnyReader on rosbag2."""
class MockReader:
"""Mock reader."""
# pylint: disable=too-few-public-methods
def __init__(self, paths: list[Path]):
"""Initialize mock."""
_ = paths
self.metadata = {'storage_identifier': 'mcap'}
self.connections = [
Connection(
1,
'/foo',
'test_msg/msg/Foo',
'string foo',
'msg',
0,
None, # type: ignore
self,
),
Connection(
2,
'/bar',
'test_msg/msg/Bar',
'module test_msgs { module msg { struct Bar {string bar;}; }; };',
'idl',
0,
None, # type: ignore
self,
),
Connection(
3,
'/baz',
'test_msg/msg/Baz',
'',
'',
0,
None, # type: ignore
self,
),
]
def open(self) -> None:
"""Unused."""
with patch('rosbags.highlevel.anyreader.Reader2', MockReader), \
patch('rosbags.highlevel.anyreader.register_types') as mock_register_types:
AnyReader([bags2[0]]).open()
mock_register_types.assert_called_once()
assert mock_register_types.call_args[0][0] == {
'test_msg/msg/Foo': ([], [('foo', (1, 'string'))]),
'test_msgs/msg/Bar': ([], [('bar', (1, 'string'))]),
}

rosbags/tests/test_parse.py
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Message definition parser tests."""
import pytest
from rosbags.typesys import (
TypesysError,
generate_msgdef,
get_types_from_idl,
get_types_from_msg,
register_types,
)
from rosbags.typesys.base import Nodetype
from rosbags.typesys.types import FIELDDEFS
MSG = """
# comment
bool b=true
int32 global=42
float32 f=1.33
string str= foo bar\t
std_msgs/Header header
std_msgs/msg/Bool bool
test_msgs/Bar sibling
float64 base
float64[] seq1
float64[] seq2
float64[4] array
"""
MSG_BOUNDS = """
int32[] unbounded_integer_array
int32[5] five_integers_array
int32[<=5] up_to_five_integers_array
string string_of_unbounded_size
string<=10 up_to_ten_characters_string
string[<=5] up_to_five_unbounded_strings
string<=10[] unbounded_array_of_string_up_to_ten_characters_each
string<=10[<=5] up_to_five_strings_up_to_ten_characters_each
"""
MSG_DEFAULTS = """
bool b false
uint8 i 42
uint8 o 0377
uint8 h 0xff
float32 y -314.15e-2
string name1 "John"
string name2 'Ringo'
int32[] samples [-200, -100, 0, 100, 200]
"""
MULTI_MSG = """
std_msgs/Header header
byte b
char c
Other[] o
================================================================================
MSG: std_msgs/Header
time time
================================================================================
MSG: test_msgs/Other
uint64[3] Header
uint32 static = 42
"""
CSTRING_CONFUSION_MSG = """
std_msgs/Header header
string s
================================================================================
MSG: std_msgs/Header
time time
"""
RELSIBLING_MSG = """
Header header
Other other
"""
IDL_LANG = """
// assign different literals and expressions
#ifndef FOO
#define FOO
#include <global>
#include "local"
const bool g_bool = TRUE;
const int8 g_int1 = 7;
const int8 g_int2 = 07;
const int8 g_int3 = 0x7;
const float64 g_float1 = 1.1;
const float64 g_float2 = 1e10;
const char g_char = 'c';
const string g_string1 = "";
const string<128> g_string2 = "str" "ing";
module Foo {
const int64 g_expr1 = ~1;
const int64 g_expr2 = 2 * 4;
};
#endif
"""
IDL = """
// comment in file
module test_msgs {
// comment in module
typedef std_msgs::msg::Bool Bool;
/**/ /***/ /* block comment */
/*
* block comment
*/
module msg {
// comment in submodule
typedef Bool Balias;
typedef test_msgs::msg::Bar Bar;
typedef double d4[4];
module Foo_Constants {
const int32 FOO = 32;
const int64 BAR = 64;
};
@comment(type="text", text="ignore")
struct Foo {
// comment in struct
std_msgs::msg::Header header;
Balias bool;
Bar sibling;
double/* comment in member declaration */x;
sequence<double> seq1;
sequence<double, 4> seq2;
d4 array;
};
};
struct Bar {
int i;
};
};
"""
IDL_STRINGARRAY = """
module test_msgs {
module msg {
typedef string string__3[3];
struct Strings {
string__3 values;
};
};
};
"""
def test_parse_empty_msg() -> None:
"""Test msg parser with empty message."""
ret = get_types_from_msg('', 'std_msgs/msg/Empty')
assert ret == {'std_msgs/msg/Empty': ([], [])}
def test_parse_bounds_msg() -> None:
"""Test msg parser."""
ret = get_types_from_msg(MSG_BOUNDS, 'test_msgs/msg/Foo')
assert ret == {
'test_msgs/msg/Foo': (
[],
[
('unbounded_integer_array', (4, ((1, 'int32'), None))),
('five_integers_array', (3, ((1, 'int32'), 5))),
('up_to_five_integers_array', (4, ((1, 'int32'), None))),
('string_of_unbounded_size', (1, 'string')),
('up_to_ten_characters_string', (1, 'string')),
('up_to_five_unbounded_strings', (4, ((1, 'string'), None))),
('unbounded_array_of_string_up_to_ten_characters_each', (4, ((1, 'string'), None))),
('up_to_five_strings_up_to_ten_characters_each', (4, ((1, 'string'), None))),
],
),
}
def test_parse_defaults_msg() -> None:
"""Test msg parser."""
ret = get_types_from_msg(MSG_DEFAULTS, 'test_msgs/msg/Foo')
assert ret == {
'test_msgs/msg/Foo': (
[],
[
('b', (1, 'bool')),
('i', (1, 'uint8')),
('o', (1, 'uint8')),
('h', (1, 'uint8')),
('y', (1, 'float32')),
('name1', (1, 'string')),
('name2', (1, 'string')),
('samples', (4, ((1, 'int32'), None))),
],
),
}
def test_parse_msg() -> None:
"""Test msg parser."""
with pytest.raises(TypesysError, match='Could not parse'):
get_types_from_msg('invalid', 'test_msgs/msg/Foo')
ret = get_types_from_msg(MSG, 'test_msgs/msg/Foo')
assert 'test_msgs/msg/Foo' in ret
consts, fields = ret['test_msgs/msg/Foo']
assert consts == [
('b', 'bool', True),
('global', 'int32', 42),
('f', 'float32', 1.33),
('str', 'string', 'foo bar'),
]
assert fields[0][0] == 'header'
assert fields[0][1][1] == 'std_msgs/msg/Header'
assert fields[1][0] == 'bool'
assert fields[1][1][1] == 'std_msgs/msg/Bool'
assert fields[2][0] == 'sibling'
assert fields[2][1][1] == 'test_msgs/msg/Bar'
assert fields[3][1][0] == Nodetype.BASE
assert fields[4][1][0] == Nodetype.SEQUENCE
assert fields[5][1][0] == Nodetype.SEQUENCE
assert fields[6][1][0] == Nodetype.ARRAY
def test_parse_multi_msg() -> None:
"""Test multi msg parser."""
ret = get_types_from_msg(MULTI_MSG, 'test_msgs/msg/Foo')
assert len(ret) == 3
assert 'test_msgs/msg/Foo' in ret
assert 'std_msgs/msg/Header' in ret
assert 'test_msgs/msg/Other' in ret
fields = ret['test_msgs/msg/Foo'][1]
assert fields[0][1][1] == 'std_msgs/msg/Header'
assert fields[1][1][1] == 'uint8'
assert fields[2][1][1] == 'uint8'
consts = ret['test_msgs/msg/Other'][0]
assert consts == [('static', 'uint32', 42)]
def test_parse_cstring_confusion() -> None:
"""Test that the msg separator is not confused with a const string."""
ret = get_types_from_msg(CSTRING_CONFUSION_MSG, 'test_msgs/msg/Foo')
assert len(ret) == 2
assert 'test_msgs/msg/Foo' in ret
assert 'std_msgs/msg/Header' in ret
consts, fields = ret['test_msgs/msg/Foo']
assert consts == []
assert fields[0][1][1] == 'std_msgs/msg/Header'
assert fields[1][1][1] == 'string'
def test_parse_relative_siblings_msg() -> None:
"""Test relative siblings with msg parser."""
ret = get_types_from_msg(RELSIBLING_MSG, 'test_msgs/msg/Foo')
assert ret['test_msgs/msg/Foo'][1][0][1][1] == 'std_msgs/msg/Header'
assert ret['test_msgs/msg/Foo'][1][1][1][1] == 'test_msgs/msg/Other'
ret = get_types_from_msg(RELSIBLING_MSG, 'rel_msgs/msg/Foo')
assert ret['rel_msgs/msg/Foo'][1][0][1][1] == 'std_msgs/msg/Header'
assert ret['rel_msgs/msg/Foo'][1][1][1][1] == 'rel_msgs/msg/Other'
def test_parse_idl() -> None:
"""Test idl parser."""
ret = get_types_from_idl(IDL_LANG)
assert ret == {}
ret = get_types_from_idl(IDL)
assert 'test_msgs/msg/Foo' in ret
consts, fields = ret['test_msgs/msg/Foo']
assert consts == [('FOO', 'int32', 32), ('BAR', 'int64', 64)]
assert fields[0][0] == 'header'
assert fields[0][1][1] == 'std_msgs/msg/Header'
assert fields[1][0] == 'bool'
assert fields[1][1][1] == 'std_msgs/msg/Bool'
assert fields[2][0] == 'sibling'
assert fields[2][1][1] == 'test_msgs/msg/Bar'
assert fields[3][1][0] == Nodetype.BASE
assert fields[4][1][0] == Nodetype.SEQUENCE
assert fields[5][1][0] == Nodetype.SEQUENCE
assert fields[6][1][0] == Nodetype.ARRAY
assert 'test_msgs/Bar' in ret
consts, fields = ret['test_msgs/Bar']
assert consts == []
assert len(fields) == 1
assert fields[0][0] == 'i'
assert fields[0][1][1] == 'int'
ret = get_types_from_idl(IDL_STRINGARRAY)
consts, fields = ret['test_msgs/msg/Strings']
assert consts == []
assert len(fields) == 1
assert fields[0][0] == 'values'
assert fields[0][1] == (Nodetype.ARRAY, ((Nodetype.BASE, 'string'), 3))
def test_register_types() -> None:
"""Test type registration."""
assert 'foo' not in FIELDDEFS
register_types({})
register_types({'foo': [[], [('b', (1, 'bool'))]]}) # type: ignore
assert 'foo' in FIELDDEFS
register_types({'std_msgs/msg/Header': [[], []]}) # type: ignore
assert len(FIELDDEFS['std_msgs/msg/Header'][1]) == 2
with pytest.raises(TypesysError, match='different definition'):
register_types({'foo': [[], [('x', (1, 'bool'))]]}) # type: ignore
def test_generate_msgdef() -> None:
"""Test message definition generator."""
res = generate_msgdef('std_msgs/msg/Header')
assert res == ('uint32 seq\ntime stamp\nstring frame_id\n', '2176decaecbce78abc3b96ef049fabed')
res = generate_msgdef('geometry_msgs/msg/PointStamped')
assert res[0].split(f'{"=" * 80}\n') == [
'std_msgs/Header header\ngeometry_msgs/Point point\n',
'MSG: std_msgs/Header\nuint32 seq\ntime stamp\nstring frame_id\n',
'MSG: geometry_msgs/Point\nfloat64 x\nfloat64 y\nfloat64 z\n',
]
res = generate_msgdef('geometry_msgs/msg/Twist')
assert res[0].split(f'{"=" * 80}\n') == [
'geometry_msgs/Vector3 linear\ngeometry_msgs/Vector3 angular\n',
'MSG: geometry_msgs/Vector3\nfloat64 x\nfloat64 y\nfloat64 z\n',
]
res = generate_msgdef('shape_msgs/msg/Mesh')
assert res[0].split(f'{"=" * 80}\n') == [
'shape_msgs/MeshTriangle[] triangles\ngeometry_msgs/Point[] vertices\n',
'MSG: shape_msgs/MeshTriangle\nuint32[3] vertex_indices\n',
'MSG: geometry_msgs/Point\nfloat64 x\nfloat64 y\nfloat64 z\n',
]
res = generate_msgdef('shape_msgs/msg/Plane')
assert res[0] == 'float64[4] coef\n'
res = generate_msgdef('sensor_msgs/msg/MultiEchoLaserScan')
assert len(res[0].split('=' * 80)) == 3
register_types(get_types_from_msg('time[3] times\nuint8 foo=42', 'foo_msgs/Timelist'))
res = generate_msgdef('foo_msgs/msg/Timelist')
assert res[0] == 'uint8 foo=42\ntime[3] times\n'
with pytest.raises(TypesysError, match='is unknown'):
generate_msgdef('foo_msgs/msg/Badname')
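`generate_msgdef`, as the assertions above exercise, joins dependent definitions with an 80-character `=` separator line. A minimal stdlib sketch of splitting such a concatenated definition (the sample text mirrors the `std_msgs/Header` output asserted above):

```python
# ROS1 concatenated message definitions separate entries with 80 '=' chars.
SEP = '=' * 80 + '\n'

msgdef = (
    'std_msgs/Header header\ngeometry_msgs/Point point\n'
    + SEP
    + 'MSG: std_msgs/Header\nuint32 seq\ntime stamp\nstring frame_id\n'
)

parts = msgdef.split(SEP)
assert parts[0] == 'std_msgs/Header header\ngeometry_msgs/Point point\n'
assert parts[1].startswith('MSG: std_msgs/Header')
```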


@ -0,0 +1,751 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Reader tests."""
# pylint: disable=redefined-outer-name
from __future__ import annotations
import sqlite3
import struct
from io import BytesIO
from itertools import groupby
from pathlib import Path
from typing import TYPE_CHECKING
from unittest import mock
import pytest
import zstandard
from rosbags.rosbag2 import Reader, ReaderError, Writer
from .test_serde import MSG_JOINT, MSG_MAGN, MSG_MAGN_BIG, MSG_POLY
if TYPE_CHECKING:
from typing import BinaryIO, Iterable
from _pytest.fixtures import SubRequest
METADATA = """
rosbag2_bagfile_information:
version: 4
storage_identifier: sqlite3
relative_file_paths:
- db.db3{extension}
duration:
nanoseconds: 42
starting_time:
nanoseconds_since_epoch: 666
message_count: 4
topics_with_message_count:
- topic_metadata:
name: /poly
type: geometry_msgs/msg/Polygon
serialization_format: cdr
offered_qos_profiles: ""
message_count: 1
- topic_metadata:
name: /magn
type: sensor_msgs/msg/MagneticField
serialization_format: cdr
offered_qos_profiles: ""
message_count: 2
- topic_metadata:
name: /joint
type: trajectory_msgs/msg/JointTrajectory
serialization_format: cdr
offered_qos_profiles: ""
message_count: 1
compression_format: {compression_format}
compression_mode: {compression_mode}
"""
METADATA_EMPTY = """
rosbag2_bagfile_information:
version: 6
storage_identifier: sqlite3
relative_file_paths:
- db.db3
duration:
nanoseconds: 0
starting_time:
nanoseconds_since_epoch: 0
message_count: 0
topics_with_message_count: []
compression_format: ""
compression_mode: ""
files:
- duration:
nanoseconds: 0
message_count: 0
path: db.db3
starting_time:
nanoseconds_since_epoch: 0
custom_data:
key1: value1
key2: value2
"""
@pytest.fixture(params=['none', 'file', 'message'])
def bag(request: SubRequest, tmp_path: Path) -> Path:
"""Manually construct bag."""
(tmp_path / 'metadata.yaml').write_text(
METADATA.format(
extension='' if request.param != 'file' else '.zstd',
compression_format='""' if request.param == 'none' else 'zstd',
compression_mode='""' if request.param == 'none' else request.param.upper(),
),
)
comp = zstandard.ZstdCompressor()
dbpath = tmp_path / 'db.db3'
dbh = sqlite3.connect(dbpath)
dbh.executescript(Writer.SQLITE_SCHEMA)
cur = dbh.cursor()
cur.execute(
'INSERT INTO topics VALUES(?, ?, ?, ?, ?)',
(1, '/poly', 'geometry_msgs/msg/Polygon', 'cdr', ''),
)
cur.execute(
'INSERT INTO topics VALUES(?, ?, ?, ?, ?)',
(2, '/magn', 'sensor_msgs/msg/MagneticField', 'cdr', ''),
)
cur.execute(
'INSERT INTO topics VALUES(?, ?, ?, ?, ?)',
(3, '/joint', 'trajectory_msgs/msg/JointTrajectory', 'cdr', ''),
)
cur.execute(
'INSERT INTO messages VALUES(?, ?, ?, ?)',
(1, 1, 666, MSG_POLY[0] if request.param != 'message' else comp.compress(MSG_POLY[0])),
)
cur.execute(
'INSERT INTO messages VALUES(?, ?, ?, ?)',
(2, 2, 708, MSG_MAGN[0] if request.param != 'message' else comp.compress(MSG_MAGN[0])),
)
cur.execute(
'INSERT INTO messages VALUES(?, ?, ?, ?)',
(
3,
2,
708,
MSG_MAGN_BIG[0] if request.param != 'message' else comp.compress(MSG_MAGN_BIG[0]),
),
)
cur.execute(
'INSERT INTO messages VALUES(?, ?, ?, ?)',
(4, 3, 708, MSG_JOINT[0] if request.param != 'message' else comp.compress(MSG_JOINT[0])),
)
dbh.commit()
if request.param == 'file':
with dbpath.open('rb') as ifh, (tmp_path / 'db.db3.zstd').open('wb') as ofh:
comp.copy_stream(ifh, ofh)
dbpath.unlink()
return tmp_path
def test_empty_bag(tmp_path: Path) -> None:
"""Test reading an empty bag."""
(tmp_path / 'metadata.yaml').write_text(METADATA_EMPTY)
dbpath = tmp_path / 'db.db3'
dbh = sqlite3.connect(dbpath)
dbh.executescript(Writer.SQLITE_SCHEMA)
with Reader(tmp_path) as reader:
assert reader.message_count == 0
assert reader.start_time == 2**63 - 1
assert reader.end_time == 0
assert reader.duration == 0
assert not list(reader.messages())
assert reader.custom_data['key1'] == 'value1'
assert reader.custom_data['key2'] == 'value2'
def test_reader(bag: Path) -> None:
"""Test reader and deserializer on simple bag."""
with Reader(bag) as reader:
assert reader.duration == 43
assert reader.start_time == 666
assert reader.end_time == 709
assert reader.message_count == 4
if reader.compression_mode:
assert reader.compression_format == 'zstd'
assert [x.id for x in reader.connections] == [1, 2, 3]
assert [*reader.topics.keys()] == ['/poly', '/magn', '/joint']
gen = reader.messages()
connection, timestamp, rawdata = next(gen)
assert connection.topic == '/poly'
assert connection.msgtype == 'geometry_msgs/msg/Polygon'
assert timestamp == 666
assert rawdata == MSG_POLY[0]
for idx in range(2):
connection, timestamp, rawdata = next(gen)
assert connection.topic == '/magn'
assert connection.msgtype == 'sensor_msgs/msg/MagneticField'
assert timestamp == 708
assert rawdata == [MSG_MAGN, MSG_MAGN_BIG][idx][0]
connection, timestamp, rawdata = next(gen)
assert connection.topic == '/joint'
assert connection.msgtype == 'trajectory_msgs/msg/JointTrajectory'
with pytest.raises(StopIteration):
next(gen)
def test_message_filters(bag: Path) -> None:
"""Test reader filters messages."""
with Reader(bag) as reader:
magn_connections = [x for x in reader.connections if x.topic == '/magn']
gen = reader.messages(connections=magn_connections)
connection, _, _ = next(gen)
assert connection.topic == '/magn'
connection, _, _ = next(gen)
assert connection.topic == '/magn'
with pytest.raises(StopIteration):
next(gen)
gen = reader.messages(start=667)
connection, _, _ = next(gen)
assert connection.topic == '/magn'
connection, _, _ = next(gen)
assert connection.topic == '/magn'
connection, _, _ = next(gen)
assert connection.topic == '/joint'
with pytest.raises(StopIteration):
next(gen)
gen = reader.messages(stop=667)
connection, _, _ = next(gen)
assert connection.topic == '/poly'
with pytest.raises(StopIteration):
next(gen)
gen = reader.messages(connections=magn_connections, stop=667)
with pytest.raises(StopIteration):
next(gen)
gen = reader.messages(start=666, stop=666)
with pytest.raises(StopIteration):
next(gen)
def test_user_errors(bag: Path) -> None:
"""Test user errors."""
reader = Reader(bag)
with pytest.raises(ReaderError, match='Rosbag is not open'):
next(reader.messages())
def test_failure_cases(tmp_path: Path) -> None:
"""Test bags with broken fs layout."""
with pytest.raises(ReaderError, match='not read metadata'):
Reader(tmp_path)
metadata = tmp_path / 'metadata.yaml'
metadata.write_text('')
with pytest.raises(ReaderError, match='not read'), \
mock.patch.object(Path, 'read_text', side_effect=PermissionError):
Reader(tmp_path)
metadata.write_text(' invalid:\nthis is not yaml')
with pytest.raises(ReaderError, match='not load YAML from'):
Reader(tmp_path)
metadata.write_text('foo:')
with pytest.raises(ReaderError, match='key is missing'):
Reader(tmp_path)
metadata.write_text(
METADATA.format(
extension='',
compression_format='""',
compression_mode='""',
).replace('version: 4', 'version: 999'),
)
with pytest.raises(ReaderError, match='version 999'):
Reader(tmp_path)
metadata.write_text(
METADATA.format(
extension='',
compression_format='""',
compression_mode='""',
).replace('sqlite3', 'hdf5'),
)
with pytest.raises(ReaderError, match='Storage plugin'):
Reader(tmp_path)
metadata.write_text(
METADATA.format(
extension='',
compression_format='""',
compression_mode='""',
),
)
with pytest.raises(ReaderError, match='files are missing'):
Reader(tmp_path)
(tmp_path / 'db.db3').write_text('')
metadata.write_text(
METADATA.format(
extension='',
compression_format='""',
compression_mode='""',
).replace('cdr', 'bson'),
)
with pytest.raises(ReaderError, match='Serialization format'):
Reader(tmp_path)
metadata.write_text(
METADATA.format(
extension='',
compression_format='"gz"',
compression_mode='"file"',
),
)
with pytest.raises(ReaderError, match='Compression format'):
Reader(tmp_path)
metadata.write_text(
METADATA.format(
extension='',
compression_format='""',
compression_mode='""',
),
)
with pytest.raises(ReaderError, match='not open database'), \
Reader(tmp_path) as reader:
next(reader.messages())
def write_record(bio: BinaryIO, opcode: int, records: Iterable[bytes]) -> None:
"""Write record."""
data = b''.join(records)
bio.write(bytes([opcode]) + struct.pack('<Q', len(data)) + data)
def make_string(text: str) -> bytes:
"""Serialize string."""
data = text.encode()
return struct.pack('<I', len(data)) + data
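`write_record` and `make_string` above encode the MCAP framing used by this fixture: a record is a one-byte opcode, an 8-byte little-endian payload length, and the payload; strings are 4-byte length-prefixed. A minimal round-trip sketch (`read_record` is an illustrative inverse helper, not part of rosbags):

```python
import struct
from io import BytesIO


def write_record(bio, opcode, records):
    """Write an MCAP record: opcode byte, uint64 LE length, payload."""
    data = b''.join(records)
    bio.write(bytes([opcode]) + struct.pack('<Q', len(data)) + data)


def read_record(bio):
    """Inverse of write_record: return (opcode, payload)."""
    opcode = bio.read(1)[0]
    length, = struct.unpack('<Q', bio.read(8))
    return opcode, bio.read(length)


bio = BytesIO()
write_record(bio, 0x01, (b'ros2', b'test_mcap'))
bio.seek(0)
assert read_record(bio) == (0x01, b'ros2test_mcap')
```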
MCAP_HEADER = b'\x89MCAP0\r\n'
SCHEMAS = [
(
0x03,
(
struct.pack('<H', 1),
make_string('geometry_msgs/msg/Polygon'),
make_string('ros2msg'),
make_string('string foo'),
),
),
(
0x03,
(
struct.pack('<H', 2),
make_string('sensor_msgs/msg/MagneticField'),
make_string('ros2msg'),
make_string('string foo'),
),
),
(
0x03,
(
struct.pack('<H', 3),
make_string('trajectory_msgs/msg/JointTrajectory'),
make_string('ros2msg'),
make_string('string foo'),
),
),
]
CHANNELS = [
(
0x04,
(
struct.pack('<H', 1),
struct.pack('<H', 1),
make_string('/poly'),
make_string('cdr'),
make_string(''),
),
),
(
0x04,
(
struct.pack('<H', 2),
struct.pack('<H', 2),
make_string('/magn'),
make_string('cdr'),
make_string(''),
),
),
(
0x04,
(
struct.pack('<H', 3),
struct.pack('<H', 3),
make_string('/joint'),
make_string('cdr'),
make_string(''),
),
),
]
@pytest.fixture(
params=['unindexed', 'partially_indexed', 'indexed', 'chunked_unindexed', 'chunked_indexed'],
)
def bag_mcap(request: SubRequest, tmp_path: Path) -> Path:
"""Manually construct mcap bag."""
# pylint: disable=too-many-locals
# pylint: disable=too-many-statements
(tmp_path / 'metadata.yaml').write_text(
METADATA.format(
extension='.mcap',
compression_format='""',
compression_mode='""',
).replace('sqlite3', 'mcap'),
)
path = tmp_path / 'db.db3.mcap'
bio: BinaryIO
messages: list[tuple[int, int, int]] = []
chunks = []
with path.open('wb') as bio:
realbio = bio
bio.write(MCAP_HEADER)
write_record(bio, 0x01, (make_string('ros2'), make_string('test_mcap')))
if request.param.startswith('chunked'):
bio = BytesIO()
messages = []
write_record(bio, *SCHEMAS[0])
write_record(bio, *CHANNELS[0])
messages.append((1, 666, bio.tell()))
write_record(
bio,
0x05,
(
struct.pack('<H', 1),
struct.pack('<I', 1),
struct.pack('<Q', 666),
struct.pack('<Q', 666),
MSG_POLY[0],
),
)
if request.param.startswith('chunked'):
assert isinstance(bio, BytesIO)
chunk_start = realbio.tell()
compression = make_string('')
uncompressed_size = struct.pack('<Q', len(bio.getbuffer()))
compressed_size = struct.pack('<Q', len(bio.getbuffer()))
write_record(
realbio,
0x06,
(
struct.pack('<Q', 666),
struct.pack('<Q', 666),
uncompressed_size,
struct.pack('<I', 0),
compression,
compressed_size,
bio.getbuffer(),
),
)
message_index_offsets = []
message_index_start = realbio.tell()
for channel_id, group in groupby(messages, key=lambda x: x[0]):
message_index_offsets.append((channel_id, realbio.tell()))
tpls = [y for x in group for y in x[1:]]
write_record(
realbio,
0x07,
(
struct.pack('<H', channel_id),
struct.pack('<I', 8 * len(tpls)),
struct.pack('<' + 'Q' * len(tpls), *tpls),
),
)
chunk = [
struct.pack('<Q', 666),
struct.pack('<Q', 666),
struct.pack('<Q', chunk_start),
struct.pack('<Q', message_index_start - chunk_start),
struct.pack('<I', 10 * len(message_index_offsets)),
*(struct.pack('<HQ', *x) for x in message_index_offsets),
struct.pack('<Q',
realbio.tell() - message_index_start),
compression,
compressed_size,
uncompressed_size,
]
chunks.append(chunk)
bio = BytesIO()
messages = []
write_record(bio, *SCHEMAS[1])
write_record(bio, *CHANNELS[1])
messages.append((2, 708, bio.tell()))
write_record(
bio,
0x05,
(
struct.pack('<H', 2),
struct.pack('<I', 1),
struct.pack('<Q', 708),
struct.pack('<Q', 708),
MSG_MAGN[0],
),
)
messages.append((2, 708, bio.tell()))
write_record(
bio,
0x05,
(
struct.pack('<H', 2),
struct.pack('<I', 2),
struct.pack('<Q', 708),
struct.pack('<Q', 708),
MSG_MAGN_BIG[0],
),
)
write_record(bio, *SCHEMAS[2])
write_record(bio, *CHANNELS[2])
messages.append((3, 708, bio.tell()))
write_record(
bio,
0x05,
(
struct.pack('<H', 3),
struct.pack('<I', 1),
struct.pack('<Q', 708),
struct.pack('<Q', 708),
MSG_JOINT[0],
),
)
if request.param.startswith('chunked'):
assert isinstance(bio, BytesIO)
chunk_start = realbio.tell()
compression = make_string('')
uncompressed_size = struct.pack('<Q', len(bio.getbuffer()))
compressed_size = struct.pack('<Q', len(bio.getbuffer()))
write_record(
realbio,
0x06,
(
struct.pack('<Q', 708),
struct.pack('<Q', 708),
uncompressed_size,
struct.pack('<I', 0),
compression,
compressed_size,
bio.getbuffer(),
),
)
message_index_offsets = []
message_index_start = realbio.tell()
for channel_id, group in groupby(messages, key=lambda x: x[0]):
message_index_offsets.append((channel_id, realbio.tell()))
tpls = [y for x in group for y in x[1:]]
write_record(
realbio,
0x07,
(
struct.pack('<H', channel_id),
struct.pack('<I', 8 * len(tpls)),
struct.pack('<' + 'Q' * len(tpls), *tpls),
),
)
chunk = [
struct.pack('<Q', 708),
struct.pack('<Q', 708),
struct.pack('<Q', chunk_start),
struct.pack('<Q', message_index_start - chunk_start),
struct.pack('<I', 10 * len(message_index_offsets)),
*(struct.pack('<HQ', *x) for x in message_index_offsets),
struct.pack('<Q',
realbio.tell() - message_index_start),
compression,
compressed_size,
uncompressed_size,
]
chunks.append(chunk)
bio = realbio
messages = []
if request.param in ['indexed', 'partially_indexed', 'chunked_indexed']:
summary_start = bio.tell()
for schema in SCHEMAS:
write_record(bio, *schema)
if request.param != 'partially_indexed':
for channel in CHANNELS:
write_record(bio, *channel)
if request.param == 'chunked_indexed':
for chunk in chunks:
write_record(bio, 0x08, chunk)
summary_offset_start = 0
write_record(bio, 0x0a, (b'ignored',))
write_record(
bio,
0x0b,
(
struct.pack('<Q', 4),
struct.pack('<H', 3),
struct.pack('<I', 3),
struct.pack('<I', 0),
struct.pack('<I', 0),
struct.pack('<I', 0 if request.param == 'indexed' else 1),
struct.pack('<Q', 666),
struct.pack('<Q', 708),
struct.pack('<I', 0),
),
)
write_record(bio, 0x0d, (b'ignored',))
write_record(bio, 0xff, (b'ignored',))
else:
summary_start = 0
summary_offset_start = 0
write_record(
bio,
0x02,
(
struct.pack('<Q', summary_start),
struct.pack('<Q', summary_offset_start),
struct.pack('<I', 0),
),
)
bio.write(MCAP_HEADER)
return tmp_path
def test_reader_mcap(bag_mcap: Path) -> None:
"""Test reader and deserializer on simple bag."""
with Reader(bag_mcap) as reader:
assert reader.duration == 43
assert reader.start_time == 666
assert reader.end_time == 709
assert reader.message_count == 4
if reader.compression_mode:
assert reader.compression_format == 'zstd'
assert [x.id for x in reader.connections] == [1, 2, 3]
assert [*reader.topics.keys()] == ['/poly', '/magn', '/joint']
gen = reader.messages()
connection, timestamp, rawdata = next(gen)
assert connection.topic == '/poly'
assert connection.msgtype == 'geometry_msgs/msg/Polygon'
assert timestamp == 666
assert rawdata == MSG_POLY[0]
for idx in range(2):
connection, timestamp, rawdata = next(gen)
assert connection.topic == '/magn'
assert connection.msgtype == 'sensor_msgs/msg/MagneticField'
assert timestamp == 708
assert rawdata == [MSG_MAGN, MSG_MAGN_BIG][idx][0]
connection, timestamp, rawdata = next(gen)
assert connection.topic == '/joint'
assert connection.msgtype == 'trajectory_msgs/msg/JointTrajectory'
with pytest.raises(StopIteration):
next(gen)
def test_message_filters_mcap(bag_mcap: Path) -> None:
"""Test reader filters messages."""
with Reader(bag_mcap) as reader:
magn_connections = [x for x in reader.connections if x.topic == '/magn']
gen = reader.messages(connections=magn_connections)
connection, _, _ = next(gen)
assert connection.topic == '/magn'
connection, _, _ = next(gen)
assert connection.topic == '/magn'
with pytest.raises(StopIteration):
next(gen)
gen = reader.messages(start=667)
connection, _, _ = next(gen)
assert connection.topic == '/magn'
connection, _, _ = next(gen)
assert connection.topic == '/magn'
connection, _, _ = next(gen)
assert connection.topic == '/joint'
with pytest.raises(StopIteration):
next(gen)
gen = reader.messages(stop=667)
connection, _, _ = next(gen)
assert connection.topic == '/poly'
with pytest.raises(StopIteration):
next(gen)
gen = reader.messages(connections=magn_connections, stop=667)
with pytest.raises(StopIteration):
next(gen)
gen = reader.messages(start=666, stop=666)
with pytest.raises(StopIteration):
next(gen)
def test_bag_mcap_files(tmp_path: Path) -> None:
"""Test bad mcap files."""
(tmp_path / 'metadata.yaml').write_text(
METADATA.format(
extension='.mcap',
compression_format='""',
compression_mode='""',
).replace('sqlite3', 'mcap'),
)
path = tmp_path / 'db.db3.mcap'
path.touch()
reader = Reader(tmp_path)
path.unlink()
with pytest.raises(ReaderError, match='Could not open'):
reader.open()
path.touch()
with pytest.raises(ReaderError, match='seems to be empty'):
Reader(tmp_path).open()
path.write_bytes(b'xxxxxxxx')
with pytest.raises(ReaderError, match='magic is invalid'):
Reader(tmp_path).open()
path.write_bytes(b'\x89MCAP0\r\n\xFF')
with pytest.raises(ReaderError, match='Unexpected record'):
Reader(tmp_path).open()
with path.open('wb') as bio:
bio.write(b'\x89MCAP0\r\n')
write_record(bio, 0x01, (make_string('ros1'), make_string('test_mcap')))
with pytest.raises(ReaderError, match='Profile is not'):
Reader(tmp_path).open()
with path.open('wb') as bio:
bio.write(b'\x89MCAP0\r\n')
write_record(bio, 0x01, (make_string('ros2'), make_string('test_mcap')))
with pytest.raises(ReaderError, match='File end magic is invalid'):
Reader(tmp_path).open()


@ -0,0 +1,422 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Reader tests."""
from __future__ import annotations
from collections import defaultdict
from struct import pack
from typing import TYPE_CHECKING
from unittest.mock import patch
import pytest
from rosbags.rosbag1 import Reader, ReaderError
from rosbags.rosbag1.reader import IndexData
if TYPE_CHECKING:
from pathlib import Path
from typing import Any, Sequence, Union
def ser(data: Union[dict[str, Any], bytes]) -> bytes:
"""Serialize record header."""
if isinstance(data, dict):
fields = []
for key, value in data.items():
field = b'='.join([key.encode(), value])
fields.append(pack('<L', len(field)) + field)
data = b''.join(fields)
return pack('<L', len(data)) + data
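`ser` above implements the rosbag1 record-header encoding: each field becomes a 4-byte little-endian length followed by `key=value`, and the joined field block is itself length-prefixed. An inverse sketch for illustration (`parse_header` is a hypothetical helper, not part of rosbags; `ser` is restated so the sketch is self-contained):

```python
import struct


def ser(data):
    """Serialize a record header: length-prefixed 'key=value' fields."""
    if isinstance(data, dict):
        fields = []
        for key, value in data.items():
            field = b'='.join([key.encode(), value])
            fields.append(struct.pack('<L', len(field)) + field)
        data = b''.join(fields)
    return struct.pack('<L', len(data)) + data


def parse_header(buf):
    """Inverse sketch: parse a serialized record header back into a dict."""
    total, = struct.unpack_from('<L', buf, 0)
    pos, end, fields = 4, 4 + total, {}
    while pos < end:
        flen, = struct.unpack_from('<L', buf, pos)
        pos += 4
        key, _, value = buf[pos:pos + flen].partition(b'=')
        fields[key.decode()] = value
        pos += flen
    return fields


raw = ser({'op': b'\x03', 'conn_count': struct.pack('<L', 0)})
assert parse_header(raw) == {'op': b'\x03', 'conn_count': b'\x00\x00\x00\x00'}
```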
def create_default_header() -> dict[str, bytes]:
"""Create empty rosbag header."""
return {
'op': b'\x03',
'conn_count': pack('<L', 0),
'chunk_count': pack('<L', 0),
}
def create_connection(
cid: int = 1,
topic: int = 0,
typ: int = 0,
) -> tuple[dict[str, bytes], dict[str, bytes]]:
"""Create connection record."""
return {
'op': b'\x07',
'conn': pack('<L', cid),
'topic': f'/topic{topic}'.encode(),
}, {
'type': f'foo_msgs/msg/Foo{typ}'.encode(),
'md5sum': b'AAAA',
'message_definition': b'MSGDEF',
}
def create_message(
cid: int = 1,
time: int = 0,
msg: int = 0,
) -> tuple[dict[str, Union[bytes, int]], bytes]:
"""Create message record."""
return {
'op': b'\x02',
'conn': cid,
'time': time,
}, f'MSGCONTENT{msg}'.encode()
def write_bag( # pylint: disable=too-many-locals
bag: Path,
header: dict[str, bytes],
chunks: Sequence[Any] = (),
) -> None:
"""Write bag file."""
magic = b'#ROSBAG V2.0\n'
pos = 13 + 4096
conn_count = 0
chunk_count = len(chunks or [])
chunks_bytes = b''
connections = b''
chunkinfos = b''
if chunks:
for chunk in chunks:
chunk_bytes = b''
start_time = 2**32 - 1
end_time = 0
counts: dict[int, int] = defaultdict(int)
index = {}
offset = 0
for head, data in chunk:
if head.get('op') == b'\x07':
conn_count += 1
add = ser(head) + ser(data)
chunk_bytes += add
connections += add
elif head.get('op') == b'\x02':
time = head['time']
head['time'] = pack('<LL', head['time'], 0)
conn = head['conn']
head['conn'] = pack('<L', head['conn'])
start_time = min([start_time, time])
end_time = max([end_time, time])
counts[conn] += 1
if conn not in index:
index[conn] = {
'count': 0,
'msgs': b'',
}
index[conn]['count'] += 1 # type: ignore
index[conn]['msgs'] += pack('<LLL', time, 0, offset) # type: ignore
add = ser(head) + ser(data)
chunk_bytes += add
offset = len(chunk_bytes)
else:
add = ser(head) + ser(data)
chunk_bytes += add
chunk_bytes = ser(
{
'op': b'\x05',
'compression': b'none',
'size': pack('<L', len(chunk_bytes)),
},
) + ser(chunk_bytes)
for conn, data in index.items():
chunk_bytes += ser(
{
'op': b'\x04',
'ver': pack('<L', 1),
'conn': pack('<L', conn),
'count': pack('<L', data['count']),
},
) + ser(data['msgs'])
chunks_bytes += chunk_bytes
chunkinfos += ser(
{
'op': b'\x06',
'ver': pack('<L', 1),
'chunk_pos': pack('<Q', pos),
'start_time': pack('<LL', start_time, 0),
'end_time': pack('<LL', end_time, 0),
'count': pack('<L', len(counts.keys())),
},
) + ser(b''.join([pack('<LL', x, y) for x, y in counts.items()]))
pos += len(chunk_bytes)
header['conn_count'] = pack('<L', conn_count)
header['chunk_count'] = pack('<L', chunk_count)
if 'index_pos' not in header:
header['index_pos'] = pack('<Q', pos)
header_bytes = ser(header)
header_bytes += b'\x20' * (4096 - len(header_bytes))
bag.write_bytes(b''.join([
magic,
header_bytes,
chunks_bytes,
connections,
chunkinfos,
]))
def test_indexdata() -> None:
"""Test IndexData sort order."""
x42_1_0 = IndexData(42, 1, 0)
x42_2_0 = IndexData(42, 2, 0)
x43_3_0 = IndexData(43, 3, 0)
assert not x42_1_0 < x42_2_0
assert x42_1_0 <= x42_2_0
assert x42_1_0 == x42_2_0
assert not x42_1_0 != x42_2_0 # noqa
assert x42_1_0 >= x42_2_0
assert not x42_1_0 > x42_2_0
assert x42_1_0 < x43_3_0
assert x42_1_0 <= x43_3_0
assert not x42_1_0 == x43_3_0 # noqa
assert x42_1_0 != x43_3_0
assert not x42_1_0 >= x43_3_0
assert not x42_1_0 > x43_3_0
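As the assertions above show, `IndexData` compares by timestamp alone: entries with equal times are considered equal regardless of chunk or offset. A standalone sketch of that ordering contract (this `IndexEntry` class is illustrative, not the rosbags implementation):

```python
from functools import total_ordering


@total_ordering
class IndexEntry:
    """Index entry ordered solely by timestamp."""

    def __init__(self, time, chunk, offset):
        self.time = time
        self.chunk = chunk
        self.offset = offset

    def __eq__(self, other):
        return self.time == other.time

    def __lt__(self, other):
        return self.time < other.time


assert IndexEntry(42, 1, 0) == IndexEntry(42, 2, 0)
assert IndexEntry(42, 1, 0) <= IndexEntry(42, 2, 0)
assert IndexEntry(42, 1, 0) < IndexEntry(43, 3, 0)
assert not IndexEntry(42, 1, 0) > IndexEntry(42, 2, 0)
```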
def test_reader(tmp_path: Path) -> None: # pylint: disable=too-many-statements
"""Test reader and deserializer on simple bag."""
# empty bag
bag = tmp_path / 'test.bag'
write_bag(bag, create_default_header())
with Reader(bag) as reader:
assert reader.message_count == 0
assert reader.start_time == 2**63 - 1
assert reader.end_time == 0
assert reader.duration == 0
assert not list(reader.messages())
# empty bag, explicit encryptor
bag = tmp_path / 'test.bag'
write_bag(bag, {**create_default_header(), 'encryptor': b''})
with Reader(bag) as reader:
assert reader.message_count == 0
# single message
write_bag(
bag,
create_default_header(),
chunks=[[
create_connection(),
create_message(time=42),
]],
)
with Reader(bag) as reader:
assert reader.message_count == 1
assert reader.duration == 1
assert reader.start_time == 42 * 10**9
assert reader.end_time == 42 * 10**9 + 1
assert len(reader.topics.keys()) == 1
assert reader.topics['/topic0'].msgcount == 1
msgs = list(reader.messages())
assert len(msgs) == 1
# sorts by time on same topic
write_bag(
bag,
create_default_header(),
chunks=[
[
create_connection(),
create_message(time=10, msg=10),
create_message(time=5, msg=5),
],
],
)
with Reader(bag) as reader:
assert reader.message_count == 2
assert reader.duration == 5 * 10**9 + 1
assert reader.start_time == 5 * 10**9
assert reader.end_time == 10 * 10**9 + 1
assert len(reader.topics.keys()) == 1
assert reader.topics['/topic0'].msgcount == 2
msgs = list(reader.messages())
assert len(msgs) == 2
assert msgs[0][0].topic == '/topic0'
assert msgs[0][2] == b'MSGCONTENT5'
assert msgs[1][0].topic == '/topic0'
assert msgs[1][2] == b'MSGCONTENT10'
# sorts by time on different topic
write_bag(
bag,
create_default_header(),
chunks=[
[
create_connection(),
create_message(time=10, msg=10),
create_connection(cid=2, topic=2),
create_message(cid=2, time=5, msg=5),
],
],
)
with Reader(bag) as reader:
assert len(reader.topics.keys()) == 2
assert reader.topics['/topic0'].msgcount == 1
assert reader.topics['/topic2'].msgcount == 1
msgs = list(reader.messages())
assert len(msgs) == 2
assert msgs[0][2] == b'MSGCONTENT5'
assert msgs[1][2] == b'MSGCONTENT10'
connections = [x for x in reader.connections if x.topic == '/topic0']
msgs = list(reader.messages(connections))
assert len(msgs) == 1
assert msgs[0][2] == b'MSGCONTENT10'
msgs = list(reader.messages(start=7 * 10**9))
assert len(msgs) == 1
assert msgs[0][2] == b'MSGCONTENT10'
msgs = list(reader.messages(stop=7 * 10**9))
assert len(msgs) == 1
assert msgs[0][2] == b'MSGCONTENT5'
def test_user_errors(tmp_path: Path) -> None:
"""Test user errors."""
bag = tmp_path / 'test.bag'
write_bag(bag, create_default_header(), chunks=[[
create_connection(),
create_message(),
]])
reader = Reader(bag)
with pytest.raises(ReaderError, match='is not open'):
next(reader.messages())
def test_failure_cases(tmp_path: Path) -> None: # pylint: disable=too-many-statements
"""Test failure cases."""
bag = tmp_path / 'test.bag'
with pytest.raises(ReaderError, match='does not exist'):
Reader(bag).open()
bag.write_text('')
with patch('pathlib.Path.open', side_effect=IOError), \
pytest.raises(ReaderError, match='not open'):
Reader(bag).open()
with pytest.raises(ReaderError, match='empty'):
Reader(bag).open()
bag.write_text('#BADMAGIC')
with pytest.raises(ReaderError, match='magic is invalid'):
Reader(bag).open()
bag.write_text('#ROSBAG V3.0\n')
with pytest.raises(ReaderError, match='Bag version 300 is not supported.'):
Reader(bag).open()
bag.write_bytes(b'#ROSBAG V2.0\x0a\x00')
with pytest.raises(ReaderError, match='Header could not be read from file.'):
Reader(bag).open()
bag.write_bytes(b'#ROSBAG V2.0\x0a\x01\x00\x00\x00')
with pytest.raises(ReaderError, match='Header could not be read from file.'):
Reader(bag).open()
bag.write_bytes(b'#ROSBAG V2.0\x0a\x01\x00\x00\x00\x01')
with pytest.raises(ReaderError, match='Header field size could not be read.'):
Reader(bag).open()
bag.write_bytes(b'#ROSBAG V2.0\x0a\x04\x00\x00\x00\x01\x00\x00\x00')
with pytest.raises(ReaderError, match='Declared field size is too large for header.'):
Reader(bag).open()
bag.write_bytes(b'#ROSBAG V2.0\x0a\x05\x00\x00\x00\x01\x00\x00\x00x')
with pytest.raises(ReaderError, match='Header field could not be parsed.'):
Reader(bag).open()
write_bag(bag, {'encryptor': b'enc', **create_default_header()})
with pytest.raises(ReaderError, match='is not supported'):
Reader(bag).open()
write_bag(bag, {**create_default_header(), 'index_pos': pack('<Q', 0)})
with pytest.raises(ReaderError, match='Bag is not indexed'):
Reader(bag).open()
write_bag(bag, create_default_header(), chunks=[[
create_connection(),
create_message(),
]])
bag.write_bytes(bag.read_bytes().replace(b'none', b'COMP'))
with pytest.raises(ReaderError, match='Compression \'COMP\' is not supported.'):
Reader(bag).open()
write_bag(bag, create_default_header(), chunks=[[
create_connection(),
create_message(),
]])
bag.write_bytes(bag.read_bytes().replace(b'ver=\x01', b'ver=\x02'))
with pytest.raises(ReaderError, match='CHUNK_INFO version 2 is not supported.'):
Reader(bag).open()
write_bag(bag, create_default_header(), chunks=[[
create_connection(),
create_message(),
]])
bag.write_bytes(bag.read_bytes().replace(b'ver=\x01', b'ver=\x02', 1))
with pytest.raises(ReaderError, match='IDXDATA version 2 is not supported.'):
Reader(bag).open()
write_bag(bag, create_default_header(), chunks=[[
create_connection(),
create_message(),
]])
bag.write_bytes(bag.read_bytes().replace(b'op=\x02', b'op=\x00', 1))
with Reader(bag) as reader, \
pytest.raises(ReaderError, match='Expected to find message data.'):
next(reader.messages())
write_bag(bag, create_default_header(), chunks=[[
create_connection(),
create_message(),
]])
bag.write_bytes(bag.read_bytes().replace(b'op=\x03', b'op=\x02', 1))
with pytest.raises(ReaderError, match='Record of type \'MSGDATA\' is unexpected.'):
Reader(bag).open()
# bad uint8 field
write_bag(
bag,
create_default_header(),
chunks=[[
({}, {}),
create_connection(),
create_message(),
]],
)
with Reader(bag) as reader, \
pytest.raises(ReaderError, match='field \'op\''):
next(reader.messages())
# bad uint32, uint64, time field
for name in ('conn_count', 'chunk_pos', 'time'):
write_bag(bag, create_default_header(), chunks=[[create_connection(), create_message()]])
bag.write_bytes(bag.read_bytes().replace(name.encode(), b'x' * len(name), 1))
if name == 'time':
with pytest.raises(ReaderError, match=f'field \'{name}\''), \
Reader(bag) as reader:
next(reader.messages())
else:
with pytest.raises(ReaderError, match=f'field \'{name}\''):
Reader(bag).open()


@ -0,0 +1,45 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Test full data roundtrip."""
from __future__ import annotations
from typing import TYPE_CHECKING
import pytest
from rosbags.rosbag2 import Reader, Writer
from rosbags.serde import deserialize_cdr, serialize_cdr
if TYPE_CHECKING:
from pathlib import Path
@pytest.mark.parametrize('mode', [*Writer.CompressionMode])
def test_roundtrip(mode: Writer.CompressionMode, tmp_path: Path) -> None:
"""Test full data roundtrip."""
class Foo: # pylint: disable=too-few-public-methods
"""Dummy class."""
data = 1.25
path = tmp_path / 'rosbag2'
wbag = Writer(path)
wbag.set_compression(mode, wbag.CompressionFormat.ZSTD)
with wbag:
msgtype = 'std_msgs/msg/Float64'
wconnection = wbag.add_connection('/test', msgtype)
wbag.write(wconnection, 42, serialize_cdr(Foo, msgtype))
rbag = Reader(path)
with rbag:
gen = rbag.messages()
rconnection, _, raw = next(gen)
assert rconnection.topic == wconnection.topic
assert rconnection.msgtype == wconnection.msgtype
assert rconnection.ext == wconnection.ext
msg = deserialize_cdr(raw, rconnection.msgtype)
assert getattr(msg, 'data', None) == Foo.data
with pytest.raises(StopIteration):
next(gen)


@ -0,0 +1,44 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Test full data roundtrip."""
from __future__ import annotations
from typing import TYPE_CHECKING
import pytest
from rosbags.rosbag1 import Reader, Writer
from rosbags.serde import cdr_to_ros1, deserialize_cdr, ros1_to_cdr, serialize_cdr
if TYPE_CHECKING:
from pathlib import Path
from typing import Optional
@pytest.mark.parametrize('fmt', [None, Writer.CompressionFormat.BZ2, Writer.CompressionFormat.LZ4])
def test_roundtrip(tmp_path: Path, fmt: Optional[Writer.CompressionFormat]) -> None:
"""Test full data roundtrip."""
class Foo: # pylint: disable=too-few-public-methods
"""Dummy class."""
data = 1.25
path = tmp_path / 'test.bag'
wbag = Writer(path)
if fmt:
wbag.set_compression(fmt)
with wbag:
msgtype = 'std_msgs/msg/Float64'
conn = wbag.add_connection('/test', msgtype)
wbag.write(conn, 42, cdr_to_ros1(serialize_cdr(Foo, msgtype), msgtype))
rbag = Reader(path)
with rbag:
gen = rbag.messages()
connection, _, raw = next(gen)
msg = deserialize_cdr(ros1_to_cdr(raw, connection.msgtype), connection.msgtype)
assert getattr(msg, 'data', None) == Foo.data
with pytest.raises(StopIteration):
next(gen)

rosbags/tests/test_serde.py

@ -0,0 +1,513 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Serializer and deserializer tests."""
from __future__ import annotations
from typing import TYPE_CHECKING
from unittest.mock import MagicMock, patch
import numpy
import pytest
from rosbags.serde import (
SerdeError,
cdr_to_ros1,
deserialize_cdr,
deserialize_ros1,
ros1_to_cdr,
serialize_cdr,
serialize_ros1,
)
from rosbags.serde.messages import get_msgdef
from rosbags.typesys import get_types_from_msg, register_types, types
from rosbags.typesys.types import builtin_interfaces__msg__Time as Time
from rosbags.typesys.types import geometry_msgs__msg__Polygon as Polygon
from rosbags.typesys.types import sensor_msgs__msg__MagneticField as MagneticField
from rosbags.typesys.types import std_msgs__msg__Header as Header
from .cdr import deserialize, serialize
if TYPE_CHECKING:
from typing import Any, Generator, Union
MSG_POLY = (
(
b'\x00\x01\x00\x00' # header
b'\x02\x00\x00\x00' # number of points = 2
b'\x00\x00\x80\x3f' # x = 1
b'\x00\x00\x00\x40' # y = 2
b'\x00\x00\x40\x40' # z = 3
b'\x00\x00\xa0\x3f' # x = 1.25
b'\x00\x00\x10\x40' # y = 2.25
b'\x00\x00\x50\x40' # z = 3.25
),
'geometry_msgs/msg/Polygon',
True,
)
MSG_MAGN = (
(
b'\x00\x01\x00\x00' # header
b'\xc4\x02\x00\x00\x00\x01\x00\x00' # timestamp = 708s 256ns
b'\x06\x00\x00\x00foo42\x00' # frameid 'foo42'
b'\x00\x00\x00\x00\x00\x00' # padding
b'\x00\x00\x00\x00\x00\x00\x60\x40' # x = 128
b'\x00\x00\x00\x00\x00\x00\x60\x40' # y = 128
b'\x00\x00\x00\x00\x00\x00\x60\x40' # z = 128
b'\x00\x00\x00\x00\x00\x00\xF0\x3F' # covariance matrix = 3x3 diag
b'\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\xF0\x3F'
b'\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\xF0\x3F'
),
'sensor_msgs/msg/MagneticField',
True,
)
MSG_MAGN_BIG = (
(
b'\x00\x00\x00\x00' # header
b'\x00\x00\x02\xc4\x00\x00\x01\x00' # timestamp = 708s 256ns
b'\x00\x00\x00\x06foo42\x00' # frameid 'foo42'
b'\x00\x00\x00\x00\x00\x00' # padding
b'\x40\x60\x00\x00\x00\x00\x00\x00' # x = 128
b'\x40\x60\x00\x00\x00\x00\x00\x00' # y = 128
b'\x40\x60\x00\x00\x00\x00\x00\x00' # z = 128
b'\x3F\xF0\x00\x00\x00\x00\x00\x00' # covariance matrix = 3x3 diag
b'\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x3F\xF0\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x3F\xF0\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00' # garbage
),
'sensor_msgs/msg/MagneticField',
False,
)
MSG_JOINT = (
(
b'\x00\x01\x00\x00' # header
b'\xc4\x02\x00\x00\x00\x01\x00\x00' # timestamp = 708s 256ns
b'\x04\x00\x00\x00bar\x00' # frameid 'bar'
b'\x02\x00\x00\x00' # number of strings
b'\x02\x00\x00\x00a\x00' # string 'a'
b'\x00\x00' # padding
b'\x02\x00\x00\x00b\x00' # string 'b'
b'\x00\x00' # padding
b'\x00\x00\x00\x00' # number of points
b'\x00\x00\x00' # garbage
),
'trajectory_msgs/msg/JointTrajectory',
True,
)
MESSAGES = [MSG_POLY, MSG_MAGN, MSG_MAGN_BIG, MSG_JOINT]
STATIC_64_64 = """
uint64[2] u64
"""
STATIC_64_16 = """
uint64 u64
uint16 u16
"""
STATIC_16_64 = """
uint16 u16
uint64 u64
"""
DYNAMIC_64_64 = """
uint64[] u64
"""
DYNAMIC_64_B_64 = """
uint64 u64
bool b
float64 f64
"""
DYNAMIC_64_S = """
uint64 u64
string s
"""
DYNAMIC_S_64 = """
string s
uint64 u64
"""
CUSTOM = """
string base_str
float32 base_f32
test_msgs/msg/static_64_64 msg_s66
test_msgs/msg/static_64_16 msg_s61
test_msgs/msg/static_16_64 msg_s16
test_msgs/msg/dynamic_64_64 msg_d66
test_msgs/msg/dynamic_64_b_64 msg_d6b6
test_msgs/msg/dynamic_64_s msg_d6s
test_msgs/msg/dynamic_s_64 msg_ds6
string[2] arr_base_str
float32[2] arr_base_f32
test_msgs/msg/static_64_64[2] arr_msg_s66
test_msgs/msg/static_64_16[2] arr_msg_s61
test_msgs/msg/static_16_64[2] arr_msg_s16
test_msgs/msg/dynamic_64_64[2] arr_msg_d66
test_msgs/msg/dynamic_64_b_64[2] arr_msg_d6b6
test_msgs/msg/dynamic_64_s[2] arr_msg_d6s
test_msgs/msg/dynamic_s_64[2] arr_msg_ds6
string[] seq_base_str
float32[] seq_base_f32
test_msgs/msg/static_64_64[] seq_msg_s66
test_msgs/msg/static_64_16[] seq_msg_s61
test_msgs/msg/static_16_64[] seq_msg_s16
test_msgs/msg/dynamic_64_64[] seq_msg_d66
test_msgs/msg/dynamic_64_b_64[] seq_msg_d6b6
test_msgs/msg/dynamic_64_s[] seq_msg_d6s
test_msgs/msg/dynamic_s_64[] seq_msg_ds6
"""
SU64_B = """
uint64[] su64
bool b
"""
SU64_U64 = """
uint64[] su64
uint64 u64
"""
SMSG_U64 = """
su64_u64[] seq
uint64 u64
"""
@pytest.fixture()
def _comparable() -> Generator[None, None, None]:
"""Make messages containing numpy arrays comparable.
Notes:
This solution is necessary as numpy.ndarray is not directly patchable.
"""
frombuffer = numpy.frombuffer
def arreq(self: MagicMock, other: Union[MagicMock, Any]) -> bool:
lhs = self._mock_wraps # pylint: disable=protected-access
rhs = getattr(other, '_mock_wraps', other)
return (lhs == rhs).all() # type: ignore
class CNDArray(MagicMock):
"""Mock ndarray."""
def __init__(self, *args: Any, **kwargs: Any): # noqa: ANN401
super().__init__(*args, **kwargs)
self.dtype = kwargs['wraps'].dtype
self.reshape = kwargs['wraps'].reshape
self.__eq__ = arreq # type: ignore
def byteswap(self, *args: Any) -> CNDArray: # noqa: ANN401
"""Wrap return value also in mock."""
return CNDArray(wraps=self._mock_wraps.byteswap(*args))
def wrap_frombuffer(*args: Any, **kwargs: Any) -> CNDArray: # noqa: ANN401
return CNDArray(wraps=frombuffer(*args, **kwargs))
with patch.object(numpy, 'frombuffer', side_effect=wrap_frombuffer):
yield
@pytest.mark.parametrize('message', MESSAGES)
def test_serde(message: tuple[bytes, str, bool]) -> None:
"""Test serialization deserialization roundtrip."""
rawdata, typ, is_little = message
serdeser = serialize_cdr(deserialize_cdr(rawdata, typ), typ, is_little)
assert serdeser == serialize(deserialize(rawdata, typ), typ, is_little)
assert serdeser == rawdata[:len(serdeser)]
assert len(rawdata) - len(serdeser) < 4
assert all(x == 0 for x in rawdata[len(serdeser):])
if rawdata[1] == 1:
rawdata = cdr_to_ros1(rawdata, typ)
serdeser = serialize_ros1(deserialize_ros1(rawdata, typ), typ)
assert serdeser == rawdata
@pytest.mark.usefixtures('_comparable')
def test_deserializer() -> None:
"""Test deserializer."""
msg = deserialize_cdr(*MSG_POLY[:2])
assert msg == deserialize(*MSG_POLY[:2])
assert isinstance(msg, Polygon)
assert len(msg.points) == 2
assert msg.points[0].x == 1
assert msg.points[0].y == 2
assert msg.points[0].z == 3
assert msg.points[1].x == 1.25
assert msg.points[1].y == 2.25
assert msg.points[1].z == 3.25
msg_ros1 = deserialize_ros1(cdr_to_ros1(*MSG_POLY[:2]), MSG_POLY[1])
assert msg_ros1 == msg
msg = deserialize_cdr(*MSG_MAGN[:2])
assert msg == deserialize(*MSG_MAGN[:2])
assert isinstance(msg, MagneticField)
assert 'MagneticField' in repr(msg)
assert msg.header.stamp.sec == 708
assert msg.header.stamp.nanosec == 256
assert msg.header.frame_id == 'foo42'
field = msg.magnetic_field
assert (field.x, field.y, field.z) == (128., 128., 128.)
diag = numpy.diag(msg.magnetic_field_covariance.reshape(3, 3))
assert (diag == [1., 1., 1.]).all()
msg_ros1 = deserialize_ros1(cdr_to_ros1(*MSG_MAGN[:2]), MSG_MAGN[1])
assert msg_ros1 == msg
msg_big = deserialize_cdr(*MSG_MAGN_BIG[:2])
assert msg_big == deserialize(*MSG_MAGN_BIG[:2])
assert isinstance(msg_big, MagneticField)
assert msg.magnetic_field == msg_big.magnetic_field
@pytest.mark.usefixtures('_comparable')
def test_serializer() -> None:
"""Test serializer."""
class Foo: # pylint: disable=too-few-public-methods
"""Dummy class."""
data = 7
msg = Foo()
ret = serialize_cdr(msg, 'std_msgs/msg/Int8', True)
assert ret == serialize(msg, 'std_msgs/msg/Int8', True)
assert ret == b'\x00\x01\x00\x00\x07'
ret = serialize_cdr(msg, 'std_msgs/msg/Int8', False)
assert ret == serialize(msg, 'std_msgs/msg/Int8', False)
assert ret == b'\x00\x00\x00\x00\x07'
ret = serialize_cdr(msg, 'std_msgs/msg/Int16', True)
assert ret == serialize(msg, 'std_msgs/msg/Int16', True)
assert ret == b'\x00\x01\x00\x00\x07\x00'
ret = serialize_cdr(msg, 'std_msgs/msg/Int16', False)
assert ret == serialize(msg, 'std_msgs/msg/Int16', False)
assert ret == b'\x00\x00\x00\x00\x00\x07'
@pytest.mark.usefixtures('_comparable')
def test_serializer_errors() -> None:
"""Test seralizer with broken messages."""
class Foo: # pylint: disable=too-few-public-methods
"""Dummy class."""
coef: numpy.ndarray[Any, numpy.dtype[numpy.int_]] = numpy.array([1, 2, 3, 4])
msg = Foo()
ret = serialize_cdr(msg, 'shape_msgs/msg/Plane', True)
assert ret == serialize(msg, 'shape_msgs/msg/Plane', True)
msg.coef = numpy.array([1, 2, 3, 4, 4])
with pytest.raises(SerdeError, match='array length'):
serialize_cdr(msg, 'shape_msgs/msg/Plane', True)
@pytest.mark.usefixtures('_comparable')
def test_custom_type() -> None:
"""Test custom type."""
cname = 'test_msgs/msg/custom'
register_types(dict(get_types_from_msg(STATIC_64_64, 'test_msgs/msg/static_64_64')))
register_types(dict(get_types_from_msg(STATIC_64_16, 'test_msgs/msg/static_64_16')))
register_types(dict(get_types_from_msg(STATIC_16_64, 'test_msgs/msg/static_16_64')))
register_types(dict(get_types_from_msg(DYNAMIC_64_64, 'test_msgs/msg/dynamic_64_64')))
register_types(dict(get_types_from_msg(DYNAMIC_64_B_64, 'test_msgs/msg/dynamic_64_b_64')))
register_types(dict(get_types_from_msg(DYNAMIC_64_S, 'test_msgs/msg/dynamic_64_s')))
register_types(dict(get_types_from_msg(DYNAMIC_S_64, 'test_msgs/msg/dynamic_s_64')))
register_types(dict(get_types_from_msg(CUSTOM, cname)))
static_64_64 = get_msgdef('test_msgs/msg/static_64_64', types).cls
static_64_16 = get_msgdef('test_msgs/msg/static_64_16', types).cls
static_16_64 = get_msgdef('test_msgs/msg/static_16_64', types).cls
dynamic_64_64 = get_msgdef('test_msgs/msg/dynamic_64_64', types).cls
dynamic_64_b_64 = get_msgdef('test_msgs/msg/dynamic_64_b_64', types).cls
dynamic_64_s = get_msgdef('test_msgs/msg/dynamic_64_s', types).cls
dynamic_s_64 = get_msgdef('test_msgs/msg/dynamic_s_64', types).cls
custom = get_msgdef('test_msgs/msg/custom', types).cls
msg = custom(
'str',
1.5,
static_64_64(numpy.array([64, 64], dtype=numpy.uint64)),
static_64_16(64, 16),
static_16_64(16, 64),
dynamic_64_64(numpy.array([33, 33], dtype=numpy.uint64)),
dynamic_64_b_64(64, True, 1.25),
dynamic_64_s(64, 's'),
dynamic_s_64('s', 64),
# arrays
['str_1', ''],
numpy.array([1.5, 0.75], dtype=numpy.float32),
[
static_64_64(numpy.array([64, 64], dtype=numpy.uint64)),
static_64_64(numpy.array([64, 64], dtype=numpy.uint64)),
],
[static_64_16(64, 16), static_64_16(64, 16)],
[static_16_64(16, 64), static_16_64(16, 64)],
[
dynamic_64_64(numpy.array([33, 33], dtype=numpy.uint64)),
dynamic_64_64(numpy.array([33, 33], dtype=numpy.uint64)),
],
[
dynamic_64_b_64(64, True, 1.25),
dynamic_64_b_64(64, True, 1.25),
],
[dynamic_64_s(64, 's'), dynamic_64_s(64, 's')],
[dynamic_s_64('s', 64), dynamic_s_64('s', 64)],
# sequences
['str_1', ''],
numpy.array([1.5, 0.75], dtype=numpy.float32),
[
static_64_64(numpy.array([64, 64], dtype=numpy.uint64)),
static_64_64(numpy.array([64, 64], dtype=numpy.uint64)),
],
[static_64_16(64, 16), static_64_16(64, 16)],
[static_16_64(16, 64), static_16_64(16, 64)],
[
dynamic_64_64(numpy.array([33, 33], dtype=numpy.uint64)),
dynamic_64_64(numpy.array([33, 33], dtype=numpy.uint64)),
],
[
dynamic_64_b_64(64, True, 1.25),
dynamic_64_b_64(64, True, 1.25),
],
[dynamic_64_s(64, 's'), dynamic_64_s(64, 's')],
[dynamic_s_64('s', 64), dynamic_s_64('s', 64)],
)
res = deserialize_cdr(serialize_cdr(msg, cname), cname)
assert res == deserialize(serialize(msg, cname), cname)
assert res == msg
res = deserialize_ros1(serialize_ros1(msg, cname), cname)
assert res == msg
def test_ros1_to_cdr() -> None:
"""Test ROS1 to CDR conversion."""
msgtype = 'test_msgs/msg/static_16_64'
register_types(dict(get_types_from_msg(STATIC_16_64, msgtype)))
msg_ros = (b'\x01\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x02')
msg_cdr = (
b'\x00\x01\x00\x00'
b'\x01\x00'
b'\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x02'
)
assert ros1_to_cdr(msg_ros, msgtype) == msg_cdr
assert serialize_cdr(deserialize_ros1(msg_ros, msgtype), msgtype) == msg_cdr
msgtype = 'test_msgs/msg/dynamic_s_64'
register_types(dict(get_types_from_msg(DYNAMIC_S_64, msgtype)))
msg_ros = (b'\x01\x00\x00\x00X'
b'\x00\x00\x00\x00\x00\x00\x00\x02')
msg_cdr = (
b'\x00\x01\x00\x00'
b'\x02\x00\x00\x00X\x00'
b'\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x02'
)
assert ros1_to_cdr(msg_ros, msgtype) == msg_cdr
assert serialize_cdr(deserialize_ros1(msg_ros, msgtype), msgtype) == msg_cdr
def test_cdr_to_ros1() -> None:
"""Test CDR to ROS1 conversion."""
msgtype = 'test_msgs/msg/static_16_64'
register_types(dict(get_types_from_msg(STATIC_16_64, msgtype)))
msg_ros = (b'\x01\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x02')
msg_cdr = (
b'\x00\x01\x00\x00'
b'\x01\x00'
b'\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x02'
)
assert cdr_to_ros1(msg_cdr, msgtype) == msg_ros
assert serialize_ros1(deserialize_cdr(msg_cdr, msgtype), msgtype) == msg_ros
msgtype = 'test_msgs/msg/dynamic_s_64'
register_types(dict(get_types_from_msg(DYNAMIC_S_64, msgtype)))
msg_ros = (b'\x01\x00\x00\x00X'
b'\x00\x00\x00\x00\x00\x00\x00\x02')
msg_cdr = (
b'\x00\x01\x00\x00'
b'\x02\x00\x00\x00X\x00'
b'\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x02'
)
assert cdr_to_ros1(msg_cdr, msgtype) == msg_ros
assert serialize_ros1(deserialize_cdr(msg_cdr, msgtype), msgtype) == msg_ros
header = Header(stamp=Time(42, 666), frame_id='frame')
msg_ros = cdr_to_ros1(serialize_cdr(header, 'std_msgs/msg/Header'), 'std_msgs/msg/Header')
assert msg_ros == b'\x00\x00\x00\x00*\x00\x00\x00\x9a\x02\x00\x00\x05\x00\x00\x00frame'
@pytest.mark.usefixtures('_comparable')
def test_padding_empty_sequence() -> None:
"""Test empty sequences do not add item padding."""
register_types(dict(get_types_from_msg(SU64_B, 'test_msgs/msg/su64_b')))
su64_b = get_msgdef('test_msgs/msg/su64_b', types).cls
msg = su64_b(numpy.array([], dtype=numpy.uint64), True)
cdr = serialize_cdr(msg, msg.__msgtype__)
assert cdr[4:] == b'\x00\x00\x00\x00\x01'
ros1 = cdr_to_ros1(cdr, msg.__msgtype__)
assert ros1 == cdr[4:]
assert ros1_to_cdr(ros1, msg.__msgtype__) == cdr
assert deserialize_cdr(cdr, msg.__msgtype__) == msg
@pytest.mark.usefixtures('_comparable')
def test_align_after_empty_sequence() -> None:
"""Test alignment after empty sequences."""
register_types(dict(get_types_from_msg(SU64_U64, 'test_msgs/msg/su64_u64')))
register_types(dict(get_types_from_msg(SMSG_U64, 'test_msgs/msg/smsg_u64')))
su64_u64 = get_msgdef('test_msgs/msg/su64_u64', types).cls
smsg_u64 = get_msgdef('test_msgs/msg/smsg_u64', types).cls
msg1 = su64_u64(numpy.array([], dtype=numpy.uint64), 42)
msg2 = smsg_u64([], 42)
cdr = serialize_cdr(msg1, msg1.__msgtype__)
assert cdr[4:] == b'\x00\x00\x00\x00\x00\x00\x00\x00\x2a\x00\x00\x00\x00\x00\x00\x00'
assert serialize_cdr(msg2, msg2.__msgtype__) == cdr
ros1 = cdr_to_ros1(cdr, msg1.__msgtype__)
assert ros1 == b'\x00\x00\x00\x00\x2a\x00\x00\x00\x00\x00\x00\x00'
assert cdr_to_ros1(cdr, msg2.__msgtype__) == ros1
assert ros1_to_cdr(ros1, msg1.__msgtype__) == cdr
assert deserialize_cdr(cdr, msg1.__msgtype__) == msg1
assert deserialize_cdr(cdr, msg2.__msgtype__) == msg2


@ -0,0 +1,127 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Writer tests."""
from __future__ import annotations
from typing import TYPE_CHECKING
import pytest
from rosbags.interfaces import Connection, ConnectionExtRosbag2
from rosbags.rosbag2 import Writer, WriterError
if TYPE_CHECKING:
from pathlib import Path
def test_writer(tmp_path: Path) -> None:
"""Test Writer."""
path = tmp_path / 'rosbag2'
with Writer(path) as bag:
connection = bag.add_connection('/test', 'std_msgs/msg/Int8')
bag.write(connection, 42, b'\x00')
bag.write(connection, 666, b'\x01' * 4096)
assert (path / 'metadata.yaml').exists()
assert (path / 'rosbag2.db3').exists()
size = (path / 'rosbag2.db3').stat().st_size
path = tmp_path / 'compress_none'
bag = Writer(path)
bag.set_compression(bag.CompressionMode.NONE, bag.CompressionFormat.ZSTD)
with bag:
connection = bag.add_connection('/test', 'std_msgs/msg/Int8')
bag.write(connection, 42, b'\x00')
bag.write(connection, 666, b'\x01' * 4096)
assert (path / 'metadata.yaml').exists()
assert (path / 'compress_none.db3').exists()
assert size == (path / 'compress_none.db3').stat().st_size
path = tmp_path / 'compress_file'
bag = Writer(path)
bag.set_compression(bag.CompressionMode.FILE, bag.CompressionFormat.ZSTD)
with bag:
connection = bag.add_connection('/test', 'std_msgs/msg/Int8')
bag.write(connection, 42, b'\x00')
bag.write(connection, 666, b'\x01' * 4096)
assert (path / 'metadata.yaml').exists()
assert not (path / 'compress_file.db3').exists()
assert (path / 'compress_file.db3.zstd').exists()
path = tmp_path / 'compress_message'
bag = Writer(path)
bag.set_compression(bag.CompressionMode.MESSAGE, bag.CompressionFormat.ZSTD)
with bag:
connection = bag.add_connection('/test', 'std_msgs/msg/Int8')
bag.write(connection, 42, b'\x00')
bag.write(connection, 666, b'\x01' * 4096)
assert (path / 'metadata.yaml').exists()
assert (path / 'compress_message.db3').exists()
assert size > (path / 'compress_message.db3').stat().st_size
path = tmp_path / 'with_custom_data'
bag = Writer(path)
bag.open()
bag.set_custom_data('key1', 'value1')
with pytest.raises(WriterError, match='non-string value'):
bag.set_custom_data('key1', 42) # type: ignore
bag.close()
assert b'key1: value1' in (path / 'metadata.yaml').read_bytes()
def test_failure_cases(tmp_path: Path) -> None:
"""Test writer failure cases."""
with pytest.raises(WriterError, match='exists'):
Writer(tmp_path)
bag = Writer(tmp_path / 'race')
(tmp_path / 'race').mkdir()
with pytest.raises(WriterError, match='exists'):
bag.open()
bag = Writer(tmp_path / 'compress_after_open')
bag.open()
with pytest.raises(WriterError, match='already open'):
bag.set_compression(bag.CompressionMode.FILE, bag.CompressionFormat.ZSTD)
bag = Writer(tmp_path / 'topic')
with pytest.raises(WriterError, match='was not opened'):
bag.add_connection('/tf', 'tf_msgs/msg/tf2')
bag = Writer(tmp_path / 'write')
with pytest.raises(WriterError, match='was not opened'):
bag.write(
Connection(
1,
'/tf',
'tf_msgs/msg/tf2',
'',
'',
0,
ConnectionExtRosbag2('cdr', ''),
None,
),
0,
b'',
)
bag = Writer(tmp_path / 'topic')
bag.open()
bag.add_connection('/tf', 'tf_msgs/msg/tf2')
with pytest.raises(WriterError, match='only be added once'):
bag.add_connection('/tf', 'tf_msgs/msg/tf2')
bag = Writer(tmp_path / 'notopic')
bag.open()
connection = Connection(
1,
'/tf',
'tf_msgs/msg/tf2',
'',
'',
0,
ConnectionExtRosbag2('cdr', ''),
None,
)
with pytest.raises(WriterError, match='unknown connection'):
bag.write(connection, 42, b'\x00')


@ -0,0 +1,201 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Writer tests."""
from __future__ import annotations
from typing import TYPE_CHECKING
from unittest.mock import Mock
import pytest
from rosbags.rosbag1 import Writer, WriterError
if TYPE_CHECKING:
from pathlib import Path
from typing import Optional
def test_no_overwrite(tmp_path: Path) -> None:
"""Test writer does not touch existing files."""
path = tmp_path / 'test.bag'
path.write_text('foo')
with pytest.raises(WriterError, match='exists'):
Writer(path).open()
path.unlink()
writer = Writer(path)
path.write_text('foo')
with pytest.raises(WriterError, match='exists'):
writer.open()
def test_empty(tmp_path: Path) -> None:
"""Test empty bag."""
path = tmp_path / 'test.bag'
with Writer(path):
pass
data = path.read_bytes()
assert len(data) == 13 + 4096
def test_add_connection(tmp_path: Path) -> None:
"""Test adding of connections."""
path = tmp_path / 'test.bag'
with pytest.raises(WriterError, match='not opened'):
Writer(path).add_connection('/foo', 'test_msgs/msg/Test', 'MESSAGE_DEFINITION', 'HASH')
with Writer(path) as writer:
res = writer.add_connection('/foo', 'test_msgs/msg/Test', 'MESSAGE_DEFINITION', 'HASH')
assert res.id == 0
data = path.read_bytes()
assert data.count(b'MESSAGE_DEFINITION') == 2
assert data.count(b'HASH') == 2
path.unlink()
with Writer(path) as writer:
res = writer.add_connection('/foo', 'std_msgs/msg/Int8')
assert res.id == 0
data = path.read_bytes()
assert data.count(b'int8 data') == 2
assert data.count(b'27ffa0c9c4b8fb8492252bcad9e5c57b') == 2
path.unlink()
with Writer(path) as writer:
writer.add_connection('/foo', 'test_msgs/msg/Test', 'MESSAGE_DEFINITION', 'HASH')
with pytest.raises(WriterError, match='can only be added once'):
writer.add_connection('/foo', 'test_msgs/msg/Test', 'MESSAGE_DEFINITION', 'HASH')
path.unlink()
with Writer(path) as writer:
res1 = writer.add_connection('/foo', 'test_msgs/msg/Test', 'MESSAGE_DEFINITION', 'HASH')
res2 = writer.add_connection(
'/foo',
'test_msgs/msg/Test',
'MESSAGE_DEFINITION',
'HASH',
callerid='src',
)
res3 = writer.add_connection(
'/foo',
'test_msgs/msg/Test',
'MESSAGE_DEFINITION',
'HASH',
latching=1,
)
assert (res1.id, res2.id, res3.id) == (0, 1, 2)
def test_write_errors(tmp_path: Path) -> None:
"""Test write errors."""
path = tmp_path / 'test.bag'
with pytest.raises(WriterError, match='not opened'):
Writer(path).write(Mock(), 42, b'DEADBEEF')
with Writer(path) as writer, \
pytest.raises(WriterError, match='is no connection'):
writer.write(Mock(), 42, b'DEADBEEF')
path.unlink()
def test_write_simple(tmp_path: Path) -> None:
"""Test writing of messages."""
path = tmp_path / 'test.bag'
with Writer(path) as writer:
conn_foo = writer.add_connection('/foo', 'test_msgs/msg/Test', 'MESSAGE_DEFINITION', 'HASH')
conn_latching = writer.add_connection(
'/foo',
'test_msgs/msg/Test',
'MESSAGE_DEFINITION',
'HASH',
latching=1,
)
conn_bar = writer.add_connection(
'/bar',
'test_msgs/msg/Bar',
'OTHER_DEFINITION',
'HASH',
callerid='src',
)
writer.add_connection('/baz', 'test_msgs/msg/Baz', 'NEVER_WRITTEN', 'HASH')
writer.write(conn_foo, 42, b'DEADBEEF')
writer.write(conn_latching, 42, b'DEADBEEF')
writer.write(conn_bar, 43, b'SECRET')
writer.write(conn_bar, 43, b'SUBSEQUENT')
res = path.read_bytes()
assert res.count(b'op=\x05') == 1
assert res.count(b'op=\x06') == 1
assert res.count(b'MESSAGE_DEFINITION') == 4
assert res.count(b'latching=1') == 2
assert res.count(b'OTHER_DEFINITION') == 2
assert res.count(b'callerid=src') == 2
assert res.count(b'NEVER_WRITTEN') == 2
assert res.count(b'DEADBEEF') == 2
assert res.count(b'SECRET') == 1
assert res.count(b'SUBSEQUENT') == 1
path.unlink()
with Writer(path) as writer:
writer.chunk_threshold = 256
conn_foo = writer.add_connection('/foo', 'test_msgs/msg/Test', 'MESSAGE_DEFINITION', 'HASH')
conn_latching = writer.add_connection(
'/foo',
'test_msgs/msg/Test',
'MESSAGE_DEFINITION',
'HASH',
latching=1,
)
conn_bar = writer.add_connection(
'/bar',
'test_msgs/msg/Bar',
'OTHER_DEFINITION',
'HASH',
callerid='src',
)
writer.add_connection('/baz', 'test_msgs/msg/Baz', 'NEVER_WRITTEN', 'HASH')
writer.write(conn_foo, 42, b'DEADBEEF')
writer.write(conn_latching, 42, b'DEADBEEF')
writer.write(conn_bar, 43, b'SECRET')
writer.write(conn_bar, 43, b'SUBSEQUENT')
res = path.read_bytes()
assert res.count(b'op=\x05') == 2
assert res.count(b'op=\x06') == 2
assert res.count(b'MESSAGE_DEFINITION') == 4
assert res.count(b'latching=1') == 2
assert res.count(b'OTHER_DEFINITION') == 2
assert res.count(b'callerid=src') == 2
assert res.count(b'NEVER_WRITTEN') == 2
assert res.count(b'DEADBEEF') == 2
assert res.count(b'SECRET') == 1
assert res.count(b'SUBSEQUENT') == 1
path.unlink()
def test_compression_errors(tmp_path: Path) -> None:
"""Test compression modes."""
path = tmp_path / 'test.bag'
with Writer(path) as writer, \
pytest.raises(WriterError, match='already open'):
writer.set_compression(writer.CompressionFormat.BZ2)
@pytest.mark.parametrize('fmt', [None, Writer.CompressionFormat.BZ2, Writer.CompressionFormat.LZ4])
def test_compression_modes(tmp_path: Path, fmt: Optional[Writer.CompressionFormat]) -> None:
"""Test compression modes."""
path = tmp_path / 'test.bag'
writer = Writer(path)
if fmt:
writer.set_compression(fmt)
with writer:
conn = writer.add_connection('/foo', 'std_msgs/msg/Int8')
writer.write(conn, 42, b'\x42')
data = path.read_bytes()
assert data.count(f'compression={fmt.name.lower() if fmt else "none"}'.encode()) == 1


@ -0,0 +1,13 @@
FROM ros:rolling
RUN apt-get update \
&& apt-get upgrade -y \
&& apt-get install -y \
python3-pip
RUN python3 -m pip install ruamel.yaml zstandard
COPY src/rosbags /opt/ros/rolling/lib/python3.8/site-packages/rosbags
COPY tools/bench/bench.py /
CMD ["/usr/bin/python3", "/bench.py", "/rosbag2"]

View File

@ -0,0 +1,11 @@
=====
Bench
=====
Check and benchmark ``rosbags.rosbag2`` against ``rosbag2_py``. The provided Dockerfile creates an execution environment for the script. Run from the root of this repository::
$ docker build -t rosbags/bench -f tools/bench/Dockerfile .
The docker image expects that the rosbag2 file to benchmark is mounted under ``/rosbag2``::
$ docker run --rm -v /path/to/bag:/rosbag2 rosbags/bench
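The benchmark script times one full read-and-deserialize pass of each implementation with ``timeit`` and reports the ratio. A minimal sketch of that pattern, with hypothetical placeholder readers standing in for the real bag-reading functions (the actual script uses ``rosbags.rosbag2.Reader`` and ``rosbag2_py.SequentialReader``)::

```python
from timeit import timeit


def read_with_rosbags() -> None:
    """Hypothetical stand-in for a full read pass with rosbags."""
    sum(range(100_000))


def read_with_rosbag2_py() -> None:
    """Hypothetical stand-in for a full read pass with rosbag2_py."""
    sum(range(1_000_000))


# Time one complete pass of each implementation and report the speedup,
# mirroring how bench.py summarizes its results.
time_py = timeit(read_with_rosbag2_py, number=1)
time_lite = timeit(read_with_rosbags, number=1)
print(f'speedup {time_py / time_lite:.2f}')
```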

View File

@ -0,0 +1,154 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Check and benchmark rosbag2 read implementations."""
# pylint: disable=import-error
from __future__ import annotations
import sys
from math import isnan
from pathlib import Path
from timeit import timeit
from typing import TYPE_CHECKING
import numpy
from rclpy.serialization import deserialize_message # type: ignore
from rosbag2_py import ConverterOptions, SequentialReader, StorageOptions # type: ignore
from rosidl_runtime_py.utilities import get_message # type: ignore
from rosbags.rosbag2 import Reader
from rosbags.serde import deserialize_cdr
if TYPE_CHECKING:
from typing import Generator, Protocol
class NativeMSG(Protocol): # pylint: disable=too-few-public-methods
"""Minimal native ROS message interface used for benchmark."""
def get_fields_and_field_types(self) -> dict[str, str]:
"""Introspect message type."""
raise NotImplementedError
class ReaderPy: # pylint: disable=too-few-public-methods
"""Mimimal shim using rosbag2_py to emulate rosbag2 API."""
def __init__(self, path: Path):
"""Initialize reader shim."""
soptions = StorageOptions(str(path), 'sqlite3')
coptions = ConverterOptions('', '')
self.reader = SequentialReader()
self.reader.open(soptions, coptions)
self.typemap = {x.name: x.type for x in self.reader.get_all_topics_and_types()}
def messages(self) -> Generator[tuple[str, str, int, bytes], None, None]:
"""Expose rosbag2 like generator behavior."""
while self.reader.has_next():
topic, data, timestamp = self.reader.read_next()
yield topic, self.typemap[topic], timestamp, data
def deserialize_py(data: bytes, msgtype: str) -> NativeMSG:
"""Deserialization helper for rosidl_runtime_py + rclpy."""
pytype = get_message(msgtype)
return deserialize_message(data, pytype) # type: ignore
def compare_msg(lite: object, native: NativeMSG) -> None:
"""Compare rosbag2 (lite) vs rosbag2_py (native) message content.
Args:
lite: Message from rosbag2.
native: Message from rosbag2_py.
Raises:
AssertionError: If messages are not identical.
"""
for fieldname in native.get_fields_and_field_types().keys():
native_val = getattr(native, fieldname)
lite_val = getattr(lite, fieldname)
if hasattr(lite_val, '__dataclass_fields__'):
compare_msg(lite_val, native_val)
elif isinstance(lite_val, numpy.ndarray):
assert not (native_val != lite_val).any(), f'{fieldname}: {native_val} != {lite_val}'
elif isinstance(lite_val, list):
assert len(native_val) == len(lite_val), f'{fieldname} length mismatch'
for sub1, sub2 in zip(native_val, lite_val):
compare_msg(sub2, sub1)
elif isinstance(lite_val, float) and isnan(lite_val):
assert isnan(native_val)
else:
assert native_val == lite_val, f'{fieldname}: {native_val} != {lite_val}'
def compare(path: Path) -> None:
"""Compare raw and deserialized messages."""
with Reader(path) as reader:
gens = (reader.messages(), ReaderPy(path).messages())
for item, item_py in zip(*gens):
connection, timestamp, data = item
topic_py, msgtype_py, timestamp_py, data_py = item_py
assert connection.topic == topic_py
assert connection.msgtype == msgtype_py
assert timestamp == timestamp_py
assert data == data_py
msg_py = deserialize_py(data_py, msgtype_py)
msg = deserialize_cdr(data, connection.msgtype)
compare_msg(msg, msg_py)
assert not list(gens[0])
assert not list(gens[1])
def read_deser_rosbag2_py(path: Path) -> None:
"""Read testbag with rosbag2_py."""
soptions = StorageOptions(str(path), 'sqlite3')
coptions = ConverterOptions('', '')
reader = SequentialReader()
reader.open(soptions, coptions)
typemap = {x.name: x.type for x in reader.get_all_topics_and_types()}
while reader.has_next():
topic, rawdata, _ = reader.read_next()
msgtype = typemap[topic]
pytype = get_message(msgtype)
deserialize_message(rawdata, pytype)
def read_deser_rosbag2(path: Path) -> None:
"""Read testbag with rosbag2lite."""
with Reader(path) as reader:
for connection, _, data in reader.messages():
deserialize_cdr(data, connection.msgtype)
def main() -> None:
"""Benchmark rosbag2 against rosbag2_py."""
path = Path(sys.argv[1])
try:
print('Comparing messages from rosbag2 and rosbag2_py.') # noqa: T201
compare(path)
except AssertionError as err:
print(f'Comparison failed {err!r}') # noqa: T201
sys.exit(1)
print('Measuring execution times of rosbag2 and rosbag2_py.') # noqa: T201
time_py = timeit(lambda: read_deser_rosbag2_py(path), number=1)
time = timeit(lambda: read_deser_rosbag2(path), number=1)
print( # noqa: T201
f'Processing times:\n'
f'rosbag2_py {time_py:.3f}\n'
f'rosbag2 {time:.3f}\n'
f'speedup {time_py / time:.2f}\n',
)
if __name__ == '__main__':
main()

View File

@ -0,0 +1,11 @@
FROM ros:rolling
RUN apt-get update \
&& apt-get upgrade -y \
&& apt-get install -y \
python3-pip \
python3-rosbag
COPY tools/compare/compare.py /
CMD ["/usr/bin/python3", "/compare.py", "/rosbag1", "/rosbag2"]

View File

@ -0,0 +1,11 @@
=======
Compare
=======
Check if the contents of a ``rosbag1`` and another ``rosbag1`` or ``rosbag2`` file are identical. The provided Dockerfile creates an execution environment for the script. Run from the root of this repository::
$ docker build -t rosbags/compare -f tools/compare/Dockerfile .
The docker image expects the first rosbag1 file and the second rosbag1 or rosbag2 file to be mounted at ``/rosbag1`` and ``/rosbag2`` respectively::
$ docker run --rm -v /path/to/rosbag1.bag:/rosbag1 -v /path/to/rosbag2:/rosbag2 rosbags/compare
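At its core the comparison walks both messages field by field, recursing into nested messages and sequences and treating NaN floats as equal. A simplified, self-contained sketch of that logic using a hypothetical ``Header`` dataclass in place of real ROS messages (the actual tool introspects fields via ``get_fields_and_field_types()``)::

```python
import math
from dataclasses import dataclass, fields, is_dataclass


def compare_fields(ref: object, msg: object) -> None:
    """Recursively assert that two messages carry identical content."""
    if is_dataclass(msg):
        # Recurse into each field of a nested message.
        for field in fields(msg):
            compare_fields(getattr(ref, field.name), getattr(msg, field.name))
    elif isinstance(msg, list):
        # Sequences must match element by element.
        assert len(ref) == len(msg)
        for refitem, msgitem in zip(ref, msg):
            compare_fields(refitem, msgitem)
    elif isinstance(msg, float) and math.isnan(msg):
        # NaN != NaN, so floats need an explicit check.
        assert math.isnan(ref)
    else:
        assert ref == msg


@dataclass
class Header:
    """Hypothetical message type for illustration only."""

    stamp: float
    frame_id: str


# Identical content, including NaN stamps, passes the comparison.
compare_fields(Header(float('nan'), 'map'), Header(float('nan'), 'map'))
```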

View File

@ -0,0 +1,173 @@
# Copyright 2020-2023 Ternaris.
# SPDX-License-Identifier: Apache-2.0
"""Tool checking if contents of two rosbags are equal."""
# pylint: disable=import-error
from __future__ import annotations
import array
import math
import sys
from pathlib import Path
from typing import TYPE_CHECKING
from unittest.mock import Mock
import genpy # type: ignore
import numpy
import rosgraph_msgs.msg # type: ignore
from rclpy.serialization import deserialize_message # type: ignore
from rosbag2_py import ConverterOptions, SequentialReader, StorageOptions # type: ignore
from rosidl_runtime_py.utilities import get_message # type: ignore
rosgraph_msgs.msg.Log = Mock()
rosgraph_msgs.msg.TopicStatistics = Mock()
import rosbag.bag # type:ignore # noqa: E402 pylint: disable=wrong-import-position
if TYPE_CHECKING:
from typing import Generator, List, Protocol, Union, runtime_checkable
@runtime_checkable
class NativeMSG(Protocol): # pylint: disable=too-few-public-methods
"""Minimal native ROS message interface used for benchmark."""
def get_fields_and_field_types(self) -> dict[str, str]:
"""Introspect message type."""
raise NotImplementedError
class Reader: # pylint: disable=too-few-public-methods
"""Mimimal shim using rosbag2_py to emulate rosbags API."""
def __init__(self, path: Union[str, Path]):
"""Initialize reader shim."""
self.reader = SequentialReader()
self.reader.open(StorageOptions(path, 'sqlite3'), ConverterOptions('', ''))
self.typemap = {x.name: x.type for x in self.reader.get_all_topics_and_types()}
def messages(self) -> Generator[tuple[str, int, bytes], None, None]:
"""Expose rosbag2 like generator behavior."""
while self.reader.has_next():
topic, data, timestamp = self.reader.read_next()
pytype = get_message(self.typemap[topic])
yield topic, timestamp, deserialize_message(data, pytype)
def fixup_ros1(conns: List[rosbag.bag._Connection_Info]) -> None:
"""Monkeypatch ROS2 fieldnames onto ROS1 objects.
Args:
conns: Rosbag1 connections.
"""
genpy.Time.sec = property(lambda x: x.secs)
genpy.Time.nanosec = property(lambda x: x.nsecs)
genpy.Duration.sec = property(lambda x: x.secs)
genpy.Duration.nanosec = property(lambda x: x.nsecs)
if conn := next((x for x in conns if x.datatype == 'sensor_msgs/CameraInfo'), None):
print('Patching CameraInfo') # noqa: T201
cls = rosbag.bag._get_message_type(conn) # pylint: disable=protected-access
cls.d = property(lambda x: x.D, lambda x, y: setattr(x, 'D', y)) # noqa: B010
cls.k = property(lambda x: x.K, lambda x, y: setattr(x, 'K', y)) # noqa: B010
cls.r = property(lambda x: x.R, lambda x, y: setattr(x, 'R', y)) # noqa: B010
cls.p = property(lambda x: x.P, lambda x, y: setattr(x, 'P', y)) # noqa: B010
def compare(ref: object, msg: object) -> None:
"""Compare message to its reference.
Args:
ref: Reference ROS1 message.
msg: Converted ROS2 message.
"""
if isinstance(msg, NativeMSG):
for name in msg.get_fields_and_field_types():
refval = getattr(ref, name)
msgval = getattr(msg, name)
compare(refval, msgval)
elif isinstance(msg, array.array):
if isinstance(ref, bytes):
assert msg.tobytes() == ref
else:
assert isinstance(msg, numpy.ndarray)
assert (msg == ref).all()
elif isinstance(msg, list):
assert isinstance(ref, (list, numpy.ndarray))
assert len(msg) == len(ref)
for refitem, msgitem in zip(ref, msg):
compare(refitem, msgitem)
elif isinstance(msg, str):
assert msg == ref
elif isinstance(msg, float) and math.isnan(msg):
assert isinstance(ref, float)
assert math.isnan(ref)
else:
assert ref == msg
def main_bag1_bag1(path1: Path, path2: Path) -> None:
"""Compare rosbag1 to rosbag1 message by message.
Args:
path1: Rosbag1 filename.
path2: Rosbag1 filename.
"""
reader1 = rosbag.bag.Bag(path1)
reader2 = rosbag.bag.Bag(path2)
src1 = reader1.read_messages(raw=True, return_connection_header=True)
src2 = reader2.read_messages(raw=True, return_connection_header=True)
for msg1, msg2 in zip(src1, src2):
assert msg1.connection_header == msg2.connection_header
assert msg1.message[:-2] == msg2.message[:-2]
assert msg1.timestamp == msg2.timestamp
assert msg1.topic == msg2.topic
assert next(src1, None) is None
assert next(src2, None) is None
print('Bags are identical.') # noqa: T201
def main_bag1_bag2(path1: Path, path2: Path) -> None:
"""Compare rosbag1 to rosbag2 message by message.
Args:
path1: Rosbag1 filename.
path2: Rosbag2 filename.
"""
reader1 = rosbag.bag.Bag(path1)
src1 = reader1.read_messages()
src2 = Reader(path2).messages()
fixup_ros1(reader1._connections.values()) # pylint: disable=protected-access
for msg1, msg2 in zip(src1, src2):
assert msg1.topic == msg2[0]
assert msg1.timestamp.to_nsec() == msg2[1]
compare(msg1.message, msg2[2])
assert next(src1, None) is None
assert next(src2, None) is None
print('Bags are identical.') # noqa: T201
if __name__ == '__main__':
if len(sys.argv) != 3:
        print(f'Usage: {sys.argv[0]} [rosbag1] [rosbag2]')  # noqa: T201
sys.exit(1)
arg1 = Path(sys.argv[1])
arg2 = Path(sys.argv[2])
main = main_bag1_bag2 if arg2.is_dir() else main_bag1_bag1
main(arg1, arg2)

View File