Fix #8462: embed patch sizes in einops pattern for einops >= 0.8 compatibility#8834
Conversation
…s >= 0.8 compatibility einops 0.8.x removed support for arbitrary kwargs in Rearrange.__init__(). Replace axes_len dict and **axes_len kwarg with integer literals embedded directly in the pattern string. Semantically identical, compatible with all einops versions. Signed-off-by: UGBOMEH OGOCHUKWU WILLIAMS <williamsugbomeh@gmail.com>
📝 WalkthroughWalkthroughIntroduces a new PyTorch module Estimated code review effort🎯 2 (Simple) | ⏱️ ~10 minutes 🚥 Pre-merge checks | ✅ 4 | ❌ 1❌ Failed checks (1 warning)
✅ Passed checks (4 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings. ✨ Finishing Touches🧪 Generate unit tests (beta)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out. Comment |
There was a problem hiding this comment.
🧹 Nitpick comments (1)
monai/networks/blocks/patchembedding.py (1)
100-106: Fix looks correct and semantically equivalent.Embedded integer literals in the einops pattern cleanly sidesteps the removed kwargs path in einops ≥ 0.8. Existing shape tests parameterized over
spatial_dims∈{2,3}andpatch_size∈{8,16}cover this path.Minor nit (ruff B905): the
zipon Line 102 could usestrict=Truesincedim_namesandpatch_sizeare both lengthspatial_dimsby construction — would also surface any futurespatial_dims > 3misuse (currently silently truncates via("h","w","d")[:spatial_dims]).♻️ Optional diff
- from_chars = "b c " + " ".join(f"({name} {psize})" for name, psize in zip(dim_names, patch_size)) + from_chars = "b c " + " ".join( + f"({name} {psize})" for name, psize in zip(dim_names, patch_size, strict=True) + )🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@monai/networks/blocks/patchembedding.py` around lines 100 - 106, Change the non-strict zip used when building from_chars to a strict zip so mismatched lengths between dim_names and patch_size are caught early; specifically update the expression that constructs from_chars (where dim_names, patch_size and spatial_dims are used) to use zip(..., strict=True) so any future misuse (e.g., spatial_dims > 3) surfaces immediately.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In `@monai/networks/blocks/patchembedding.py`:
- Around line 100-106: Change the non-strict zip used when building from_chars
to a strict zip so mismatched lengths between dim_names and patch_size are
caught early; specifically update the expression that constructs from_chars
(where dim_names, patch_size and spatial_dims are used) to use zip(...,
strict=True) so any future misuse (e.g., spatial_dims > 3) surfaces immediately.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: b56cc03c-93bf-453e-a791-3e7817fd215c
📒 Files selected for processing (1)
monai/networks/blocks/patchembedding.py
In some einops 0.8.x builds, Rearrange.__init__() does not accept **kwargs for axis sizes, raising TypeError. Add _PatchRearrange as a pure-PyTorch fallback that produces identical output. The primary path still uses Rearrange with named axes (**axes_len); the fallback only activates if that call raises TypeError. Signed-off-by: UGBOMEH OGOCHUKWU WILLIAMS <williamsugbomeh@gmail.com>
There was a problem hiding this comment.
Actionable comments posted: 3
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@monai/networks/blocks/patchembedding.py`:
- Around line 131-135: Tests don't deterministically exercise the fallback
_PatchRearrange because current tests run whichever einops version is installed;
add a unit test that forces the fallback by monkey-patching the Rearrange symbol
to raise TypeError (or by directly instantiating _PatchRearrange) and assert
that outputs of the fallback match the normal Rearrange path for a
representative input shape; target names to modify/assert are _PatchRearrange,
Rearrange and the patch embedding behavior (e.g., the patch_embeddings sequence)
in the existing tests/networks/blocks/test_patchembedding.py so both branches
are covered and compared.
- Around line 32-55: Update the _PatchRearrange class: narrow the type of
patch_size in __init__ to tuple[int, ...], replace the view call in forward
(x.view(*v)) with reshape to handle non-contiguous tensors safely, and add
Google-style docstrings for the class __init__ and forward methods describing
Args and Returns (and any raised exceptions) per project guidelines; reference
the _PatchRearrange class, its __init__ signature, and the forward method when
making these changes.
- Around line 126-135: The _PatchRearrange fallback needs fixes: add
Google-style docstrings to the class methods __init__ and forward describing
arguments, return values, and behavior; change the type hint patch_size: tuple
to patch_size: tuple[int, ...]; replace any use of x.view(*v) with x.reshape(*v)
to avoid errors on non-contiguous tensors; and add a deterministic unit test
that directly instantiates and exercises _PatchRearrange (independent of
einops/Rearrange) to validate its behavior for representative
spatial_dims/patch_size combinations.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 8833ef1f-b19d-4c70-a2ad-3f4bd3d6a4a8
📒 Files selected for processing (1)
monai/networks/blocks/patchembedding.py
| class _PatchRearrange(nn.Module): | ||
| """Fallback patch rearrangement using pure PyTorch, for einops compatibility.""" | ||
|
|
||
| def __init__(self, spatial_dims: int, patch_size: tuple) -> None: | ||
| super().__init__() | ||
| self.spatial_dims = spatial_dims | ||
| self.patch_size = patch_size | ||
|
|
||
| def forward(self, x: torch.Tensor) -> torch.Tensor: | ||
| B, C = x.shape[0], x.shape[1] | ||
| sp = x.shape[2:] | ||
| g = tuple(s // p for s, p in zip(sp, self.patch_size)) | ||
| v: list[int] = [B, C] | ||
| for gi, pi in zip(g, self.patch_size): | ||
| v += [gi, pi] | ||
| x = x.view(*v) | ||
| n = self.spatial_dims | ||
| gdims = list(range(2, 2 + 2 * n, 2)) | ||
| pdims = list(range(3, 3 + 2 * n, 2)) | ||
| x = x.permute(0, *gdims, *pdims, 1).contiguous() | ||
| n_patches = 1 | ||
| for gi in g: | ||
| n_patches *= gi | ||
| return x.reshape(B, n_patches, -1) |
There was a problem hiding this comment.
🛠️ Refactor suggestion | 🟠 Major
If the fallback is kept: tighten type hint, prefer reshape, add Google-style docstrings.
Three points on _PatchRearrange:
x.view(*v)at line 47 will raise on non-contiguous inputs.reshapeis safer and no slower here.patch_size: tupleis too loose —tuple[int, ...].- Per coding guidelines, Google-style docstrings with
Args:/Returns:are expected on__init__andforward; the current one-line class docstring doesn't cover them.
As per coding guidelines: "Docstrings should be present for all definition which describe each variable, return value, and raised exception in the appropriate section of the Google-style of docstrings."
♻️ Proposed patch
class _PatchRearrange(nn.Module):
- """Fallback patch rearrangement using pure PyTorch, for einops compatibility."""
-
- def __init__(self, spatial_dims: int, patch_size: tuple) -> None:
+ """Fallback patch rearrangement in pure PyTorch, equivalent to the einops pattern
+ ``"b c (h p1) (w p2) (d p3) -> b (h w d) (p1 p2 p3 c)"`` (trimmed to ``spatial_dims``).
+ """
+
+ def __init__(self, spatial_dims: int, patch_size: tuple[int, ...]) -> None:
+ """
+ Args:
+ spatial_dims: number of spatial dimensions (2 or 3).
+ patch_size: per-axis patch size; length must equal ``spatial_dims``.
+ """
super().__init__()
self.spatial_dims = spatial_dims
self.patch_size = patch_size
def forward(self, x: torch.Tensor) -> torch.Tensor:
+ """
+ Args:
+ x: input tensor of shape ``(B, C, *spatial)`` where each spatial dim is
+ divisible by the corresponding ``patch_size`` entry.
+
+ Returns:
+ Tensor of shape ``(B, n_patches, prod(patch_size) * C)``.
+ """
B, C = x.shape[0], x.shape[1]
sp = x.shape[2:]
g = tuple(s // p for s, p in zip(sp, self.patch_size))
v: list[int] = [B, C]
for gi, pi in zip(g, self.patch_size):
v += [gi, pi]
- x = x.view(*v)
+ x = x.reshape(*v)
n = self.spatial_dims
gdims = list(range(2, 2 + 2 * n, 2))
pdims = list(range(3, 3 + 2 * n, 2))
x = x.permute(0, *gdims, *pdims, 1).contiguous()
n_patches = 1
for gi in g:
n_patches *= gi
return x.reshape(B, n_patches, -1)🧰 Tools
🪛 Ruff (0.15.10)
[warning] 43-43: zip() without an explicit strict= parameter
Add explicit value for parameter strict=
(B905)
[warning] 45-45: zip() without an explicit strict= parameter
Add explicit value for parameter strict=
(B905)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@monai/networks/blocks/patchembedding.py` around lines 32 - 55, Update the
_PatchRearrange class: narrow the type of patch_size in __init__ to tuple[int,
...], replace the view call in forward (x.view(*v)) with reshape to handle
non-contiguous tensors safely, and add Google-style docstrings for the class
__init__ and forward methods describing Args and Returns (and any raised
exceptions) per project guidelines; reference the _PatchRearrange class, its
__init__ signature, and the forward method when making these changes.
| # for 3d: "b c (h p1) (w p2) (d p3) -> b (h w d) (p1 p2 p3 c)" | ||
| chars = (("h", "p1"), ("w", "p2"), ("d", "p3"))[:spatial_dims] | ||
| from_chars = "b c " + " ".join(f"({k} {v})" for k, v in chars) | ||
| to_chars = f"b ({' '.join([c[0] for c in chars])}) ({' '.join([c[1] for c in chars])} c)" | ||
| axes_len = {f"p{i + 1}": p for i, p in enumerate(patch_size)} | ||
| self.patch_embeddings = nn.Sequential( | ||
| Rearrange(f"{from_chars} -> {to_chars}", **axes_len), nn.Linear(self.patch_dim, hidden_size) | ||
| ) | ||
| try: | ||
| rearrange_layer: nn.Module = Rearrange(f"{from_chars} -> {to_chars}", **axes_len) | ||
| except TypeError: | ||
| rearrange_layer = _PatchRearrange(spatial_dims, tuple(int(p) for p in patch_size)) | ||
| self.patch_embeddings = nn.Sequential(rearrange_layer, nn.Linear(self.patch_dim, hidden_size)) |
There was a problem hiding this comment.
🧩 Analysis chain
🌐 Web query:
einops Rearrange pattern integer literals support 0.8
💡 Result:
No, einops Rearrange patterns do not support standalone integer literals (e.g., 'b 2 h w') as axis specifications. Patterns use symbolic axis names (letters like 'b', 'h'), anonymous axes (numbers in parentheses like '(b1 b2)'), ellipsis (...), or underscores (_ for skipping). Integer values are provided via the axes_lengths keyword argument (e.g., rearrange(x, 'b (h1 h) w c -> ...', h1=2)) or parsed from tensor shapes via parse_shape. This syntax has been consistent across versions, including 0.8.0+ (released 2024-04-28), with no changes introducing integer literal support in patterns per release notes, changelogs, and docs. Examples in official documentation and tutorials confirm named axes only.
Citations:
- 1: https://einops.rocks/api/rearrange/
- 2: https://einops.rocks/api/parse_shape/
- 3: http://einops.rocks/1-einops-basics/
- 4: https://www.integer.co.za
- 5: https://einops.rocks/api/einsum/
- 6: http://einops.rocks/api/parse_shape/
- 7: https://github.com/arogozhnikov/einops/releases/tag/v0.8.1
- 8: https://github.com/arogozhnikov/einops/releases
- 9: https://pypi.org/project/einops/
- 10: https://pypi.org/project/einops/0.8.0/
- 11: arogozhnikov/einops@v0.8.0...v0.8.1
- 12: arogozhnikov/einops@v0.7.0...v0.8.0
🏁 Script executed:
# Get the file and examine _PatchRearrange class and test coverage
head -150 monai/networks/blocks/patchembedding.py | tail -100Repository: Project-MONAI/MONAI
Length of output: 4640
🏁 Script executed:
# Check for existing tests covering the fallback path
find . -name "*.py" -path "*/test*" -type f | xargs grep -l "patchembedding" | head -5Repository: Project-MONAI/MONAI
Length of output: 130
🏁 Script executed:
# Get the _PatchRearrange class definition (around lines 32-55)
sed -n '32,56p' monai/networks/blocks/patchembedding.pyRepository: Project-MONAI/MONAI
Length of output: 958
🏁 Script executed:
# Check the test file for coverage of _PatchRearrange and the fallback path
cat monai/tests/networks/blocks/test_patchembedding.pyRepository: Project-MONAI/MONAI
Length of output: 145
🏁 Script executed:
# Find the test file with correct path
find . -name "test_patchembedding.py" -type f 2>/dev/nullRepository: Project-MONAI/MONAI
Length of output: 109
🏁 Script executed:
# Also check for any tests that import or test _PatchRearrange
rg "_PatchRearrange" --type pyRepository: Project-MONAI/MONAI
Length of output: 275
🏁 Script executed:
# Read the test file
cat ./tests/networks/blocks/test_patchembedding.pyRepository: Project-MONAI/MONAI
Length of output: 7510
🏁 Script executed:
# Check einops version constraints and any comments about the fallback logic
head -30 monai/networks/blocks/patchembedding.pyRepository: Project-MONAI/MONAI
Length of output: 1308
🏁 Script executed:
# Verify if x.view() in _PatchRearrange could fail on non-contiguous tensors
rg "x\.view\(" monai/networks/blocks/patchembedding.py -A 2 -B 2Repository: Project-MONAI/MONAI
Length of output: 233
Retract the simplification suggestion; the current try/except approach is necessary.
einops Rearrange does not support integer literals in patterns—numeric axis values must be passed via axes_lengths (the current axes_len dict). The suggested approach of embedding integers as literals (e.g., "b c (h 16) (w 16) (d 16)") is not feasible. The fallback with _PatchRearrange is the correct strategy for version compatibility.
However, address these remaining issues in _PatchRearrange:
- Missing Google-style docstrings: Add docstrings to
__init__andforwardmethods describing arguments, return values, and behavior per coding guidelines. - Type hint specificity: Change
patch_size: tupletopatch_size: tuple[int, ...]. - Use
reshape()instead ofview(): Line 47 usesx.view(*v), which fails on non-contiguous tensors;reshape()is safer. - Incomplete test coverage: The test suite only exercises the
Rearrangepath (since einops is installed). The fallback is never deterministically validated. Add a test that directly instantiates and tests_PatchRearrangeindependently.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@monai/networks/blocks/patchembedding.py` around lines 126 - 135, The
_PatchRearrange fallback needs fixes: add Google-style docstrings to the class
methods __init__ and forward describing arguments, return values, and behavior;
change the type hint patch_size: tuple to patch_size: tuple[int, ...]; replace
any use of x.view(*v) with x.reshape(*v) to avoid errors on non-contiguous
tensors; and add a deterministic unit test that directly instantiates and
exercises _PatchRearrange (independent of einops/Rearrange) to validate its
behavior for representative spatial_dims/patch_size combinations.
| try: | ||
| rearrange_layer: nn.Module = Rearrange(f"{from_chars} -> {to_chars}", **axes_len) | ||
| except TypeError: | ||
| rearrange_layer = _PatchRearrange(spatial_dims, tuple(int(p) for p in patch_size)) | ||
| self.patch_embeddings = nn.Sequential(rearrange_layer, nn.Linear(self.patch_dim, hidden_size)) |
There was a problem hiding this comment.
Fallback path isn't deterministically covered by tests.
_PatchRearrange only executes when Rearrange(..., **axes_len) raises TypeError, i.e. only on einops ≥ 0.8. Existing test_shape cases (tests/networks/blocks/test_patchembedding.py) exercise whichever branch the installed einops version selects, so CI never compares both paths in the same run. Suggest a targeted test that forces the fallback — e.g. monkey-patch Rearrange to raise TypeError, or instantiate _PatchRearrange directly and compare its output against the Rearrange path for a known input.
As per coding guidelines: "Ensure new or modified definitions will be covered by existing or new unit tests."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@monai/networks/blocks/patchembedding.py` around lines 131 - 135, Tests don't
deterministically exercise the fallback _PatchRearrange because current tests
run whichever einops version is installed; add a unit test that forces the
fallback by monkey-patching the Rearrange symbol to raise TypeError (or by
directly instantiating _PatchRearrange) and assert that outputs of the fallback
match the normal Rearrange path for a representative input shape; target names
to modify/assert are _PatchRearrange, Rearrange and the patch embedding behavior
(e.g., the patch_embeddings sequence) in the existing
tests/networks/blocks/test_patchembedding.py so both branches are covered and
compared.
Signed-off-by: UGBOMEH OGOCHUKWU WILLIAMS <williamsugbomeh@gmail.com>
There was a problem hiding this comment.
♻️ Duplicate comments (2)
monai/networks/blocks/patchembedding.py (2)
131-135:⚠️ Potential issue | 🟡 MinorAdd deterministic coverage for the fallback branch.
The fallback only runs when
Rearrange(...)raisesTypeError, so normal shape tests cover whichever branch the installed dependency happens to take. Add a direct_PatchRearrangetest or monkey-patchRearrangeto force this path.As per coding guidelines: "Ensure new or modified definitions will be covered by existing or new unit tests."
#!/bin/bash # Description: Check whether tests directly exercise _PatchRearrange or force the Rearrange fallback. fd -i '^test_patchembedding\.py$' -x sh -c ' printf "\n== %s ==\n" "$1" rg -n -C3 "_PatchRearrange|monkeypatch|Rearrange|patch_embeddings" "$1" || true ' sh {}🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@monai/networks/blocks/patchembedding.py` around lines 131 - 135, Add a deterministic unit test that exercises the fallback branch by forcing the _PatchRearrange path: either write a test that instantiates and validates _PatchRearrange directly (ensuring it produces the same output/shape as the normal path) or monkey-patch the Rearrange symbol used in patchembedding.py to raise TypeError before importing/constructing the module so the try/except in patch_embeddings triggers; reference _PatchRearrange, Rearrange and patch_embeddings in the test to make the coverage explicit and assert expected tensor shapes/behaviour.
32-55:⚠️ Potential issue | 🟡 MinorCarry forward the fallback hardening fixes.
forward()still usesview()on user tensors, so non-contiguous inputs can fail. Also tightenpatch_size, add Google-style method docstrings, and address Ruff B905 with explicitzip(strict=...)if the Python target supports it.As per coding guidelines: "Docstrings should be present for all definition which describe each variable, return value, and raised exception in the appropriate section of the Google-style of docstrings."
Proposed patch
class _PatchRearrange(nn.Module): - """Fallback patch rearrangement using pure PyTorch, for einops compatibility.""" + """Fallback patch rearrangement using pure PyTorch, for einops compatibility.""" - def __init__(self, spatial_dims: int, patch_size: tuple) -> None: + def __init__(self, spatial_dims: int, patch_size: tuple[int, ...]) -> None: + """Initialize the patch rearrangement layer. + + Args: + spatial_dims: Number of spatial dimensions. + patch_size: Patch size for each spatial dimension. + """ super().__init__() self.spatial_dims = spatial_dims self.patch_size = patch_size def forward(self, x: torch.Tensor) -> torch.Tensor: + """Rearrange image patches into a patch sequence. + + Args: + x: Input tensor with shape ``(batch, channels, *spatial)``. + + Returns: + Tensor with shape ``(batch, n_patches, patch_dim)``. + + Raises: + RuntimeError: If ``x`` cannot be reshaped into the configured patch grid. + """ batch, channels = x.shape[0], x.shape[1] sp = x.shape[2:] - g = tuple(s // p for s, p in zip(sp, self.patch_size)) + g = tuple(s // p for s, p in zip(sp, self.patch_size, strict=True)) v: list[int] = [batch, channels] - for gi, pi in zip(g, self.patch_size): + for gi, pi in zip(g, self.patch_size, strict=True): v += [gi, pi] - x = x.view(*v) + x = x.reshape(*v)Verify the Python target before applying
zip(strict=True):#!/bin/bash # Description: Check project Python target metadata for zip(strict=True) support. fd '^(pyproject\.toml|setup\.cfg|setup\.py)$' --max-depth 2 -x sh -c ' printf "\n== %s ==\n" "$1" grep -nEi "requires-python|python_requires|target-version" "$1" || true ' sh {} python - <<'PY' import sys print("sandbox_python:", sys.version) print("zip_strict_supported:", sys.version_info >= (3, 10)) PY🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@monai/networks/blocks/patchembedding.py` around lines 32 - 55, The forward() in class _PatchRearrange uses tensor.view on potentially non-contiguous inputs, lacks a Google-style docstring, doesn't validate/tighten patch_size, and triggers Ruff B905 for zip(strict=...). Fix by: 1) validate and canonicalize self.patch_size in __init__ (ensure tuple of ints of length spatial_dims and values > 0); 2) add a Google-style docstring to forward describing Args and Returns and any raised exceptions; 3) avoid view on non-contiguous tensors by using torch.reshape (or call .contiguous() before view) for the initial reshape and the final reshape, and call .contiguous() after permute() before reshaping; 4) replace unsafe zip(...) usage with a conditional strict zip: if sys.version_info >= (3,10) use zip(..., strict=True) else fallback to regular zip (use unique symbols: class _PatchRearrange, method forward, attribute self.patch_size, local vars g, v, gdims, pdims) so the change is robust across Python targets.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@monai/networks/blocks/patchembedding.py`:
- Around line 131-135: Add a deterministic unit test that exercises the fallback
branch by forcing the _PatchRearrange path: either write a test that
instantiates and validates _PatchRearrange directly (ensuring it produces the
same output/shape as the normal path) or monkey-patch the Rearrange symbol used
in patchembedding.py to raise TypeError before importing/constructing the module
so the try/except in patch_embeddings triggers; reference _PatchRearrange,
Rearrange and patch_embeddings in the test to make the coverage explicit and
assert expected tensor shapes/behaviour.
- Around line 32-55: The forward() in class _PatchRearrange uses tensor.view on
potentially non-contiguous inputs, lacks a Google-style docstring, doesn't
validate/tighten patch_size, and triggers Ruff B905 for zip(strict=...). Fix by:
1) validate and canonicalize self.patch_size in __init__ (ensure tuple of ints
of length spatial_dims and values > 0); 2) add a Google-style docstring to
forward describing Args and Returns and any raised exceptions; 3) avoid view on
non-contiguous tensors by using torch.reshape (or call .contiguous() before
view) for the initial reshape and the final reshape, and call .contiguous()
after permute() before reshaping; 4) replace unsafe zip(...) usage with a
conditional strict zip: if sys.version_info >= (3,10) use zip(..., strict=True)
else fallback to regular zip (use unique symbols: class _PatchRearrange, method
forward, attribute self.patch_size, local vars g, v, gdims, pdims) so the change
is robust across Python targets.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 22291e60-3ff7-46af-ac07-515fd865b4ec
📒 Files selected for processing (1)
monai/networks/blocks/patchembedding.py
|
@ericspod: CI is green and the branch is up to date. Happy to answer any questions if helpful. |
|
Hi @williams145 the following works for me with einops 0.8.1: import torch
from einops.layers.torch import Rearrange
images = torch.rand(3, 30, 40, 3)
kwargs={"h1":2, "w1":2}
r=Rearrange('b (h h1) (w w1) c -> b h w (c h1 w1)', **kwargs)
print(r(images).shape) # torch.Size([3, 15, 20, 12])I think this replicates the use case in which you're seeing an error, can you check this code for yourself? Which version of einops specifically are you using? Even if this is a real issue needing to be fixed I would not recommend re-implementing a rearranging algorithm which is going to be error-prone and hard to test as the Coderabbit comment suggests above. I would look into using the |
Fixes #8462
Description
PatchEmbeddingBlockwithproj_type="perceptron"raisesTypeErroron einops >= 0.8.x becauseRearrange.__init__()no longer accepts arbitrary keyword arguments.The current code builds an
axes_lendict and passes it as**axes_lentoRearrange(pattern, **axes_len). einops 0.8.x removed support for this.Fix: embed the patch size values directly as integer literals in the einops pattern string — e.g.
"b c (h 16) (w 16) (d 16)"instead of using kwargs. einops supports integer literals in patterns across all versions including 0.8.x. The change is semantically identical.Types of changes
./runtests.sh --quick --unittests --disttests.