@maggiemoss
Created October 3, 2025 21:49
ERROR `Iterable[Mapping[str, Unknown] | Node | OpOverload[Ellipsis, Any] | Sequence[Unknown] | SymBool | SymFloat | SymInt | Tensor | bool | complex | device | dtype | float | int | layout | memory_format | range | slice[Any, Any, Any] | str | tuple[Unknown, ...] | None] | list[Unknown]` is not assignable to `list[Unknown]` (caused by inconsistent types when breaking cycles) [bad-assignment]
--> torch/_export/converter.py:1100:17
|
1100 | / for i, outp in enumerate(node.outputs()):
1101 | | output_name = outp.debugName()
1102 | | self.name_to_node[output_name] = self.fx_graph.call_function(
1103 | | operator.getitem,
1104 | | (
1105 | | loop_node,
| |_______________________________________^
|
ERROR `Iterable[Mapping[str, Unknown] | Node | OpOverload[Ellipsis, Any] | Sequence[Unknown] | SymBool | SymFloat | SymInt | Tensor | bool | complex | device | dtype | float | int | layout | memory_format | range | slice[Any, Any, Any] | str | tuple[Unknown, ...] | None] | list[Unknown]` is not assignable to `list[Unknown]` (caused by inconsistent types when breaking cycles) [bad-assignment]
--> torch/_export/converter.py:1112:13
|
1112 | / for i, name in enumerate(
1113 | | subgraph_converter.name_update_from_subblock_to_parent
1114 | | ):
1115 | | self.name_to_node[name] = self.fx_graph.call_function(
1116 | | operator.getitem,
1117 | | (
| |______________________^
|
ERROR Unpacked keyword argument `Module | str | Unknown` is not assignable to parameter `example_args` with type `tuple[Any, ...]` in function `ExportCase.__init__` [bad-argument-type]
--> torch/_export/db/case.py:135:23
|
135 | return ExportCase(**{**configs, "model": m, "name": name})
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Unpacked keyword argument `Module | str | Unknown` is not assignable to parameter `description` with type `str` in function `ExportCase.__init__` [bad-argument-type]
--> torch/_export/db/case.py:135:23
|
135 | return ExportCase(**{**configs, "model": m, "name": name})
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Unpacked keyword argument `Module | str | Unknown` is not assignable to parameter `model` with type `Module` in function `ExportCase.__init__` [bad-argument-type]
--> torch/_export/db/case.py:135:23
|
135 | return ExportCase(**{**configs, "model": m, "name": name})
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Unpacked keyword argument `Module | str | Unknown` is not assignable to parameter `name` with type `str` in function `ExportCase.__init__` [bad-argument-type]
--> torch/_export/db/case.py:135:23
|
135 | return ExportCase(**{**configs, "model": m, "name": name})
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Unpacked keyword argument `Module | str | Unknown` is not assignable to parameter `example_kwargs` with type `dict[str, Any]` in function `ExportCase.__init__` [bad-argument-type]
--> torch/_export/db/case.py:135:23
|
135 | return ExportCase(**{**configs, "model": m, "name": name})
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Unpacked keyword argument `Module | str | Unknown` is not assignable to parameter `extra_args` with type `tuple[Any, ...] | None` in function `ExportCase.__init__` [bad-argument-type]
--> torch/_export/db/case.py:135:23
|
135 | return ExportCase(**{**configs, "model": m, "name": name})
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Unpacked keyword argument `Module | str | Unknown` is not assignable to parameter `tags` with type `set[str]` in function `ExportCase.__init__` [bad-argument-type]
--> torch/_export/db/case.py:135:23
|
135 | return ExportCase(**{**configs, "model": m, "name": name})
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Unpacked keyword argument `Module | str | Unknown` is not assignable to parameter `support_level` with type `SupportLevel` in function `ExportCase.__init__` [bad-argument-type]
--> torch/_export/db/case.py:135:23
|
135 | return ExportCase(**{**configs, "model": m, "name": name})
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Unpacked keyword argument `Module | str | Unknown` is not assignable to parameter `dynamic_shapes` with type `dict[str, Any] | None` in function `ExportCase.__init__` [bad-argument-type]
--> torch/_export/db/case.py:135:23
|
135 | return ExportCase(**{**configs, "model": m, "name": name})
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Class member `MyAutogradFunction.forward` overrides parent class `Function` in an inconsistent manner [bad-override]
--> torch/_export/db/examples/autograd_function.py:6:9
|
6 | def forward(ctx, x):
| ^^^^^^^
|
`MyAutogradFunction.forward` has type `(ctx: Unknown, x: Unknown) -> Unknown`, which is not consistent with `(*args: Any, **kwargs: Any) -> Any` in `Function.forward` (the type of read-write attributes cannot be changed)
ERROR Class member `MyAutogradFunction.backward` overrides parent class `Function` in an inconsistent manner [bad-override]
--> torch/_export/db/examples/autograd_function.py:10:9
|
10 | def backward(ctx, grad_output):
| ^^^^^^^^
|
`MyAutogradFunction.backward` has type `(ctx: Unknown, grad_output: Unknown) -> Unknown`, which is not consistent with `(ctx: Any, *grad_outputs: Any) -> Any` in `Function.backward` (the type of read-write attributes cannot be changed)
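The two `[bad-override]` errors above arise because `torch.autograd.Function` declares `forward` as `(*args: Any, **kwargs: Any) -> Any`, and narrowing that to fixed parameters changes the attribute's type. A minimal stdlib-only sketch of the pattern and one checker-friendly fix, where `Function` below is a stand-in for `torch.autograd.Function` (torch itself is not imported):

```python
from typing import Any


class Function:
    # Stand-in mirroring torch.autograd.Function.forward's permissive signature.
    @staticmethod
    def forward(*args: Any, **kwargs: Any) -> Any:
        raise NotImplementedError


class MyAutogradFunction(Function):
    # Narrowing (*args, **kwargs) to (ctx, x) is what triggers [bad-override]:
    # code typed against Function may call forward with arguments this
    # narrower signature rejects. Keeping the permissive signature and
    # unpacking inside keeps the override consistent with the parent.
    @staticmethod
    def forward(*args: Any, **kwargs: Any) -> Any:
        ctx, x = args  # unpack the positional arguments this op expects
        return x * 2


print(MyAutogradFunction.forward(None, 21))  # 42
```

The same reasoning applies to the `backward` override and to the other `Function.forward` overrides flagged later in this log.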
ERROR Cannot index into `dict[type[TorchRuntimeError] | type[Unsupported] | type[UserError], str | None]` [index-error]
--> torch/_export/db/logging.py:42:33
|
42 | attr_name = _ALLOW_LIST[type(e)]
| ^^^^^^^
|
Argument `type[Exception]` is not assignable to parameter `key` with type `type[TorchRuntimeError] | type[Unsupported] | type[UserError]` in function `dict.__getitem__`
ERROR Returned type `tuple[Source, list[KeyEntry] | tuple[KeyEntry, ...]]` is not assignable to declared return type `tuple[Source, tuple[KeyEntry, ...]]` [bad-return]
--> torch/_export/non_strict_utils.py:104:16
|
104 | return node, kp
| ^^^^^^^^
|
ERROR `GetAttrKey | KeyEntry | MappingKey[Unknown, Unknown] | SequenceKey[Unknown]` is not assignable to `KeyEntry` (caused by inconsistent types when breaking cycles) [bad-assignment]
--> torch/_export/non_strict_utils.py:142:5
|
142 | / for k in kp:
143 | | if isinstance(k, SequenceKey):
144 | | source = GetItemSource(source, k.idx)
145 | | elif isinstance(k, MappingKey):
146 | | source = GetItemSource(source, k.key)
147 | | elif isinstance(k, GetAttrKey):
| |________________________________________^
|
ERROR `partial[SupportsRichComparisonT]` is not assignable to attribute `max` with type `Overload[[SupportsRichComparisonT](arg1: SupportsRichComparisonT, arg2: SupportsRichComparisonT, /, *_args: SupportsRichComparisonT, *, key: None = None) -> SupportsRichComparisonT, [_T](arg1: _T, arg2: _T, /, *_args: _T, *, key: (_T) -> SupportsDunderGT[Any] | SupportsDunderLT[Any]) -> _T, [SupportsRichComparisonT](iterable: Iterable[SupportsRichComparisonT], /, *, key: None = None) -> SupportsRichComparisonT, [_T](iterable: Iterable[_T], /, *, key: (_T) -> SupportsDunderGT[Any] | SupportsDunderLT[Any]) -> _T, [SupportsRichComparisonT, _T](iterable: Iterable[SupportsRichComparisonT], /, *, key: None = None, default: _T) -> SupportsRichComparisonT | _T, [_T1, _T2](iterable: Iterable[_T1], /, *, key: (_T1) -> SupportsDunderGT[Any] | SupportsDunderLT[Any], default: _T2) -> _T1 | _T2]` [bad-assignment]
--> torch/_export/non_strict_utils.py:357:20
|
357 | builtins.max = functools.partial(
| ____________________^
358 | | _tensor_min_max, real_callable=original_max, tensor_callable=torch.maximum
359 | | )
| |_____^
|
ERROR `partial[SupportsRichComparisonT]` is not assignable to attribute `min` with type `Overload[[SupportsRichComparisonT](arg1: SupportsRichComparisonT, arg2: SupportsRichComparisonT, /, *_args: SupportsRichComparisonT, *, key: None = None) -> SupportsRichComparisonT, [_T](arg1: _T, arg2: _T, /, *_args: _T, *, key: (_T) -> SupportsDunderGT[Any] | SupportsDunderLT[Any]) -> _T, [SupportsRichComparisonT](iterable: Iterable[SupportsRichComparisonT], /, *, key: None = None) -> SupportsRichComparisonT, [_T](iterable: Iterable[_T], /, *, key: (_T) -> SupportsDunderGT[Any] | SupportsDunderLT[Any]) -> _T, [SupportsRichComparisonT, _T](iterable: Iterable[SupportsRichComparisonT], /, *, key: None = None, default: _T) -> SupportsRichComparisonT | _T, [_T1, _T2](iterable: Iterable[_T1], /, *, key: (_T1) -> SupportsDunderGT[Any] | SupportsDunderLT[Any], default: _T2) -> _T1 | _T2]` [bad-assignment]
--> torch/_export/non_strict_utils.py:361:20
|
361 | builtins.min = functools.partial(
| ____________________^
362 | | _tensor_min_max, real_callable=original_min, tensor_callable=torch.minimum
363 | | )
| |_____^
|
ERROR Index 0 out of range for tuple with 0 elements [index-error]
--> torch/_export/non_strict_utils.py:1086:25
|
1086 | t = args[0]
| ^^^^^^^
|
ERROR Class member `ExportInterpreter.placeholder` overrides parent class `Interpreter` in an inconsistent manner [bad-override]
--> torch/_export/pass_base.py:191:13
|
191 | def placeholder(
| ^^^^^^^^^^^
|
`ExportInterpreter.placeholder` has type `BoundMethod[_ExportPassBaseDeprecatedDoNotUse.ExportInterpreter, (self: _ExportPassBaseDeprecatedDoNotUse.ExportInterpreter, target: str, args: tuple[Any, ...], kwargs: dict[str, Any]) -> ProxyValue[Unknown]]`, which is not assignable to `BoundMethod[_ExportPassBaseDeprecatedDoNotUse.ExportInterpreter, (self: _ExportPassBaseDeprecatedDoNotUse.ExportInterpreter, target: ((...) -> Any) | str, args: tuple[Mapping[str, Unknown] | Node | OpOverload[Ellipsis, Any] | Sequence[Unknown] | SymBool | SymFloat | SymInt | Tensor | bool | complex | device | dtype | float | int | layout | memory_format | range | slice[Any, Any, Any] | str | tuple[Unknown, ...] | None, ...], kwargs: dict[str, Any]) -> Any]`, the type of `Interpreter.placeholder`
ERROR `Interpreter` is not assignable to attribute `interpreter` with type `PropagateUnbackedSymInts` [bad-assignment]
--> torch/_export/pass_base.py:442:27
|
442 | prev_interpreter, self.interpreter = (
| ^^^^^^^^^^^^^^^^
|
ERROR `FakeTensorMode | nullcontext[None]` is not assignable to variable `fake_mode` with type `FakeTensorMode | None` [bad-assignment]
--> torch/_export/passes/_node_metadata_hook.py:35:17
|
35 | fake_mode = fake_mode or contextlib.nullcontext()
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Cannot use `None` as a context manager [bad-context-manager]
--> torch/_export/passes/_node_metadata_hook.py:50:14
|
50 | with fake_mode, enable_python_dispatcher():
| ^^^^^^^^^
|
Object of class `NoneType` has no attribute `__enter__`
ERROR Cannot use `None` as a context manager [bad-context-manager]
--> torch/_export/passes/_node_metadata_hook.py:50:14
|
50 | with fake_mode, enable_python_dispatcher():
| ^^^^^^^^^
|
Object of class `NoneType` has no attribute `__exit__`
ERROR Object of class `str` has no attribute `__name__` [missing-attribute]
--> torch/_export/passes/_node_metadata_hook.py:84:16
|
84 | f"{node.target.__name__}_0",
| ^^^^^^^^^^^^^^^^^^^^
|
ERROR Object of class `str` has no attribute `__name__` [missing-attribute]
--> torch/_export/passes/_node_metadata_hook.py:85:49
|
85 | f"{node.target.__class__.__name__}.{node.target.__name__}",
| ^^^^^^^^^^^^^^^^^^^^
|
ERROR `Node | Unknown | None` is not assignable to `None` (caused by inconsistent types when breaking cycles) [bad-assignment]
--> torch/_export/passes/replace_quantized_ops_with_standard_ops_pass.py:570:5
|
570 | / for node in gm.graph.nodes:
571 | | if isinstance(node.target, OpOverload):
572 | | with gm.graph.inserting_before(node):
573 | | namespace, opname = node.target.namespace, node.target._opname
574 | | if namespace == "quantized" and opname not in [
575 | | "conv_prepack",
| |____________________________________^
|
ERROR Object of class `ScriptObject` has no attribute `items` [missing-attribute]
--> torch/_export/passes/replace_quantized_ops_with_standard_ops_pass.py:632:44
|
632 | for b_name, b_value in v.items():
| ^^^^^^^
|
ERROR Object of class `ScriptObject` has no attribute `pop` [missing-attribute]
--> torch/_export/passes/replace_quantized_ops_with_standard_ops_pass.py:639:25
|
639 | v.pop(b_name, None)
| ^^^^^
|
ERROR Instance-only attribute `__name__` of class `HigherOrderOperator` is not visible on the class [missing-attribute]
--> torch/_export/passes/replace_with_hop_pass_util.py:38:16
|
38 | f"{wrap_hoo.__class__.__name__}.{wrap_hoo.__name__}",
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Argument `dict[str, dict[str, int | list[str] | None]]` is not assignable to parameter `dims` with type `dict[str, RootDim]` in function `DynamicShapesSpec.__init__` [bad-argument-type]
--> torch/_export/serde/dynamic_shapes.py:57:66
|
57 | spec = DynamicShapesSpec(dynamic_shapes=dynamic_shapes, dims=dims)
| ^^^^
|
ERROR `tuple[str, ...]` is not assignable to variable `dynamic_shapes` with type `dict[str, Any] | list[Any] | tuple[Any] | None` [bad-assignment]
--> torch/_export/serde/dynamic_shapes.py:186:22
|
186 | dynamic_shapes = tuple(dynamic_shapes)
| ^^^^^^^^^^^^^^^^^^^^^
|
ERROR Argument `dict[str, Any] | list[Any] | tuple[Any] | None` is not assignable to parameter `iterable` with type `Iterable[str]` in function `tuple.__new__` [bad-argument-type]
--> torch/_export/serde/dynamic_shapes.py:186:28
|
186 | dynamic_shapes = tuple(dynamic_shapes)
| ^^^^^^^^^^^^^^
|
ERROR Argument `str | None` is not assignable to parameter `package` with type `ModuleType | str` in function `importlib.resources.is_resource` [bad-argument-type]
--> torch/_export/serde/schema_check.py:626:40
|
626 | if importlib.resources.is_resource(__package__, "schema.yaml"):
| ^^^^^^^^^^^
|
ERROR Argument `str | None` is not assignable to parameter `package` with type `ModuleType | str` in function `importlib.resources.read_text` [bad-argument-type]
--> torch/_export/serde/schema_check.py:627:49
|
627 | content = importlib.resources.read_text(__package__, "schema.yaml")
| ^^^^^^^^^^^
|
ERROR Argument `str | None` is not assignable to parameter `package` with type `ModuleType | str` in function `importlib.resources.read_text` [bad-argument-type]
--> torch/_export/serde/schema_check.py:634:13
|
634 | __package__, "export_schema.thrift"
| ^^^^^^^^^^^
|
ERROR Object of class `NoneType` has no attribute `replace` [missing-attribute]
--> torch/_export/serde/schema_check.py:657:17
|
657 | yaml_path = __package__.replace(".", "/") + "/schema.yaml"
| ^^^^^^^^^^^^^^^^^^^
|
ERROR Object of class `NoneType` has no attribute `replace` [missing-attribute]
--> torch/_export/serde/schema_check.py:658:26
|
658 | thrift_schema_path = __package__.replace(".", "/") + "/export_schema.thrift"
| ^^^^^^^^^^^^^^^^^^^
|
ERROR Returned type `FakeTensor | Parameter` is not assignable to declared return type `FakeTensor` [bad-return]
--> torch/_export/serde/serialize.py:386:12
|
386 | return fake_tensor
| ^^^^^^^^^^^
|
ERROR Argument `SymBoolArgument | SymFloatArgument | SymIntArgument | Unknown` is not assignable to parameter `arg` with type `SymFloatArgument | SymIntArgument | TensorArgument` in function `GraphModuleDeserializer.generate_getitem` [bad-argument-type]
--> torch/_export/serde/serialize.py:2743:58
|
2743 | self.generate_getitem(meta_val, fx_node, arg, 0, deserialized_metadata)
| ^^^
|
ERROR Object of class `type` has no attribute `create` [missing-attribute]
--> torch/_export/serde/serialize.py:3168:16
|
3168 | return cls.create(**{_type: _dict_to_dataclass(field_type, _value)})
| ^^^^^^^^^^
|
ERROR No matching overload found for function `dict.__init__` [no-matching-overload]
--> torch/_export/serde/serialize.py:3474:32
|
3474 | sorted_tensor_values = dict(
| ________________________________^
3475 | | sorted(graph.tensor_values.items(), key=operator.itemgetter(0))
3476 | | )
| |_____^
|
Possible overloads:
() -> None [closest match]
(**kwargs: _VT) -> None
(map: SupportsKeysAndGetItem[_KT, _VT], /) -> None
(map: SupportsKeysAndGetItem[str, _VT], /, **kwargs: _VT) -> None
(iterable: Iterable[tuple[_KT, _VT]], /) -> None
(iterable: Iterable[tuple[str, _VT]], /, **kwargs: _VT) -> None
(iterable: Iterable[list[str]], /) -> None
(iterable: Iterable[list[bytes]], /) -> None
ERROR No matching overload found for function `dict.__init__` [no-matching-overload]
--> torch/_export/serde/serialize.py:3477:33
|
3477 | sorted_sym_int_values = dict(
| _________________________________^
3478 | | sorted(graph.sym_int_values.items(), key=operator.itemgetter(0))
3479 | | )
| |_____^
|
Possible overloads:
() -> None [closest match]
(**kwargs: _VT) -> None
(map: SupportsKeysAndGetItem[_KT, _VT], /) -> None
(map: SupportsKeysAndGetItem[str, _VT], /, **kwargs: _VT) -> None
(iterable: Iterable[tuple[_KT, _VT]], /) -> None
(iterable: Iterable[tuple[str, _VT]], /, **kwargs: _VT) -> None
(iterable: Iterable[list[str]], /) -> None
(iterable: Iterable[list[bytes]], /) -> None
ERROR No matching overload found for function `dict.__init__` [no-matching-overload]
--> torch/_export/serde/serialize.py:3480:35
|
3480 | sorted_sym_float_values = dict(
| ___________________________________^
3481 | | sorted(graph.sym_float_values.items(), key=operator.itemgetter(0))
3482 | | )
| |_____^
|
Possible overloads:
() -> None [closest match]
(**kwargs: _VT) -> None
(map: SupportsKeysAndGetItem[_KT, _VT], /) -> None
(map: SupportsKeysAndGetItem[str, _VT], /, **kwargs: _VT) -> None
(iterable: Iterable[tuple[_KT, _VT]], /) -> None
(iterable: Iterable[tuple[str, _VT]], /, **kwargs: _VT) -> None
(iterable: Iterable[list[str]], /) -> None
(iterable: Iterable[list[bytes]], /) -> None
ERROR No matching overload found for function `dict.__init__` [no-matching-overload]
--> torch/_export/serde/serialize.py:3483:34
|
3483 | sorted_sym_bool_values = dict(
| __________________________________^
3484 | | sorted(graph.sym_bool_values.items(), key=operator.itemgetter(0))
3485 | | )
| |_____^
|
Possible overloads:
() -> None [closest match]
(**kwargs: _VT) -> None
(map: SupportsKeysAndGetItem[_KT, _VT], /) -> None
(map: SupportsKeysAndGetItem[str, _VT], /, **kwargs: _VT) -> None
(iterable: Iterable[tuple[_KT, _VT]], /) -> None
(iterable: Iterable[tuple[str, _VT]], /, **kwargs: _VT) -> None
(iterable: Iterable[list[str]], /) -> None
(iterable: Iterable[list[bytes]], /) -> None
ERROR No matching overload found for function `dict.__init__` [no-matching-overload]
--> torch/_export/serde/serialize.py:3486:36
|
3486 | sorted_custom_obj_values = dict(
| ____________________________________^
3487 | | sorted(graph.custom_obj_values.items(), key=operator.itemgetter(0))
3488 | | )
| |_____^
|
Possible overloads:
() -> None [closest match]
(**kwargs: _VT) -> None
(map: SupportsKeysAndGetItem[_KT, _VT], /) -> None
(map: SupportsKeysAndGetItem[str, _VT], /, **kwargs: _VT) -> None
(iterable: Iterable[tuple[_KT, _VT]], /) -> None
(iterable: Iterable[tuple[str, _VT]], /, **kwargs: _VT) -> None
(iterable: Iterable[list[str]], /) -> None
(iterable: Iterable[list[bytes]], /) -> None
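The five identical `[no-matching-overload]` errors above come from `dict(sorted(...))` when the mapping's value type is partially `Unknown`, so no `dict.__init__` overload matches. A sketch of one workaround that gives the checker concrete `(key, value)` pairs; the concrete value type below is an assumption for illustration:

```python
import operator


def sort_by_key(d: dict[str, int]) -> dict[str, int]:
    # A comprehension over sorted items spells out the key/value types
    # element by element, so overload resolution on dict() is never needed.
    return {k: v for k, v in sorted(d.items(), key=operator.itemgetter(0))}


print(sort_by_key({"b": 2, "a": 1}))  # {'a': 1, 'b': 2}
```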
ERROR `constants` cannot be annotated with `set[str]`, it is already defined with type `set[str] | None` [annotation-mismatch]
--> torch/_export/serde/serialize.py:3542:5
|
3542 | constants: set[str] = constants or set()
| ^^^^^^^^^
|
WARN `_register_pytree_node` is deprecated [deprecated]
--> torch/_export/utils.py:39:5
|
39 | _register_pytree_node,
| ---------------------
|
ERROR Argument `Tensor | float | int | str | Unknown` is not assignable to parameter `arg` with type `int` in function `_check_symint` [bad-argument-type]
--> torch/_export/utils.py:473:27
|
473 | node_val, arg, range_constraints, unification_map, key_path, None
| ^^^
|
ERROR Cannot index into `dict[str, str]` [index-error]
--> torch/_export/utils.py:1118:36
|
1118 | spec.target = name_map[spec.target][4:] # strip obj_ prefix
| ^^^^^^^^^^^
|
Argument `str | None` is not assignable to parameter `key` with type `str` in function `dict.__getitem__`
ERROR Cannot index into `dict[str, str]` [index-error]
--> torch/_export/utils.py:1124:36
|
1124 | spec.target = name_map[spec.target]
| ^^^^^^^^^^^
|
Argument `str | None` is not assignable to parameter `key` with type `str` in function `dict.__getitem__`
ERROR Class member `AssociativeScanOp.gen_schema` overrides parent class `HigherOrderOperator` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/associative_scan.py:99:9
|
99 | def gen_schema(self, combine_fn, xs, additional_inputs):
| ^^^^^^^^^^
|
`AssociativeScanOp.gen_schema` has type `BoundMethod[AssociativeScanOp, (self: AssociativeScanOp, combine_fn: Unknown, xs: Unknown, additional_inputs: Unknown) -> Unknown]`, which is not assignable to `BoundMethod[AssociativeScanOp, (self: AssociativeScanOp, *args: Unknown, **kwargs: Unknown) -> Unknown]`, the type of `HigherOrderOperator.gen_schema`
ERROR Class member `AssociativeScanAutogradOp.forward` overrides parent class `Function` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/associative_scan.py:651:9
|
651 | def forward(
| ^^^^^^^
|
`AssociativeScanAutogradOp.forward` has type `(ctx: Unknown, combine_fn: Unknown, num_xs: Unknown, num_additional_inputs: Unknown, *operands: Unknown) -> Unknown`, which is not consistent with `(*args: Any, **kwargs: Any) -> Any` in `Function.forward` (the type of read-write attributes cannot be changed)
ERROR `HigherOrderOperator | OpOverload[Ellipsis, Any]` is not assignable to variable `op` with type `HopInstance | OpOverload[Ellipsis, Any]` [bad-assignment]
--> torch/_higher_order_ops/auto_functionalize.py:612:10
|
612 | op = op._op if isinstance(op, HopInstance) else op
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Class member `BaseHOP.gen_schema` overrides parent class `HigherOrderOperator` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/base_hop.py:173:9
|
173 | def gen_schema(self, subgraph, *operands, **kwargs):
| ^^^^^^^^^^
|
`BaseHOP.gen_schema` has type `BoundMethod[BaseHOP, (self: BaseHOP, subgraph: Unknown, *operands: Unknown, **kwargs: Unknown) -> Unknown]`, which is not assignable to `BoundMethod[BaseHOP, (self: BaseHOP, *args: Unknown, **kwargs: Unknown) -> Unknown]`, the type of `HigherOrderOperator.gen_schema`
ERROR Class member `BaseHOPFunction.forward` overrides parent class `Function` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/base_hop.py:217:9
|
217 | def forward(ctx, hop, subgraph, kwargs, *operands):
| ^^^^^^^
|
`BaseHOPFunction.forward` has type `(ctx: Unknown, hop: Unknown, subgraph: Unknown, kwargs: Unknown, *operands: Unknown) -> Unknown`, which is not consistent with `(*args: Any, **kwargs: Any) -> Any` in `Function.forward` (the type of read-write attributes cannot be changed)
ERROR Class member `CondOp.gen_schema` overrides parent class `HigherOrderOperator` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/cond.py:55:9
|
55 | def gen_schema(self, pred, true_fn, false_fn, operands):
| ^^^^^^^^^^
|
`CondOp.gen_schema` has type `BoundMethod[CondOp, (self: CondOp, pred: Unknown, true_fn: Unknown, false_fn: Unknown, operands: Unknown) -> Unknown]`, which is not assignable to `BoundMethod[CondOp, (self: CondOp, *args: Unknown, **kwargs: Unknown) -> Unknown]`, the type of `HigherOrderOperator.gen_schema`
ERROR Class member `CondAutogradOp.forward` overrides parent class `Function` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/cond.py:287:9
|
287 | def forward(
| ^^^^^^^
|
`CondAutogradOp.forward` has type `(ctx: Unknown, pred: Unknown, true_fn: Unknown, false_fn: Unknown, *operands: Unknown) -> Unknown`, which is not consistent with `(*args: Any, **kwargs: Any) -> Any` in `Function.forward` (the type of read-write attributes cannot be changed)
ERROR Argument `list[Any] | Any | None` is not assignable to parameter `args` with type `tuple[Any]` in function `torch._subclasses.functional_tensor.PythonFunctionalizeAPI.wrap_tensors` [bad-argument-type]
--> torch/_higher_order_ops/effects.py:301:29
|
301 | return ctx.wrap_tensors(unwrapped_outs)
| ^^^^^^^^^^^^^^
|
ERROR Object of class `Tracer` has no attribute `unwrap_proxy` [missing-attribute]
--> torch/_higher_order_ops/flex_attention.py:357:34
|
357 | proxy_args = pytree.tree_map(proxy_mode.tracer.unwrap_proxy, node_args)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Argument `PythonKeyTracer | Tracer` is not assignable to parameter `tracer` with type `PythonKeyTracer | _GraphAppendingTracerEx` in function `torch.fx.experimental.proxy_tensor.track_tensor_tree` [bad-argument-type]
--> torch/_higher_order_ops/flex_attention.py:362:55
|
362 | example_out, out_proxy, constant=None, tracer=proxy_mode.tracer
| ^^^^^^^^^^^^^^^^^
|
ERROR Class member `FlexAttentionAutogradOp.forward` overrides parent class `Function` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/flex_attention.py:624:9
|
624 | def forward(
| ^^^^^^^
|
`FlexAttentionAutogradOp.forward` has type `(ctx: Any, query: Tensor, key: Tensor, value: Tensor, fw_graph: (...) -> Unknown, joint_graph: (...) -> Unknown, block_mask: tuple[Any, ...], scale: float, kernel_options: dict[str, Any], mask_mod_other_buffers: tuple[Any, ...], *score_mod_other_buffers: tuple[Any, ...]) -> tuple[Tensor, Tensor, Tensor]`, which is not consistent with `(*args: Any, **kwargs: Any) -> Any` in `Function.forward` (the type of read-write attributes cannot be changed)
ERROR Object of class `Tracer` has no attribute `unwrap_proxy` [missing-attribute]
--> torch/_higher_order_ops/flex_attention.py:1066:34
|
1066 | proxy_args = pytree.tree_map(proxy_mode.tracer.unwrap_proxy, node_args)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Argument `PythonKeyTracer | Tracer` is not assignable to parameter `tracer` with type `PythonKeyTracer | _GraphAppendingTracerEx` in function `torch.fx.experimental.proxy_tensor.track_tensor_tree` [bad-argument-type]
--> torch/_higher_order_ops/flex_attention.py:1075:55
|
1075 | example_out, out_proxy, constant=None, tracer=proxy_mode.tracer
| ^^^^^^^^^^^^^^^^^
|
ERROR Class member `InvokeSubgraphHOP.gen_schema` overrides parent class `HigherOrderOperator` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/invoke_subgraph.py:89:9
|
89 | def gen_schema(self, subgraph, identifier, *operands):
| ^^^^^^^^^^
|
`InvokeSubgraphHOP.gen_schema` has type `BoundMethod[InvokeSubgraphHOP, (self: InvokeSubgraphHOP, subgraph: Unknown, identifier: Unknown, *operands: Unknown) -> Unknown]`, which is not assignable to `BoundMethod[InvokeSubgraphHOP, (self: InvokeSubgraphHOP, *args: Unknown, **kwargs: Unknown) -> Unknown]`, the type of `HigherOrderOperator.gen_schema`
ERROR Class member `InvokeSubgraphAutogradOp.forward` overrides parent class `Function` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/invoke_subgraph.py:404:9
|
404 | def forward(
| ^^^^^^^
|
`InvokeSubgraphAutogradOp.forward` has type `(ctx: Unknown, subgraph: Unknown, identifier: Unknown, output_metadata: Unknown, *operands: Unknown) -> Unknown`, which is not consistent with `(*args: Any, **kwargs: Any) -> Any` in `Function.forward` (the type of read-write attributes cannot be changed)
ERROR `tuple[object, ...]` is not assignable to variable `tangent_metadata` with type `list[object]` [bad-assignment]
--> torch/_higher_order_ops/invoke_subgraph.py:480:28
|
480 | tangent_metadata = tuple(tangent_metadata)
| ^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Class member `LocalMapAutogradOp.forward` overrides parent class `Function` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/local_map.py:202:9
|
202 | def forward(
| ^^^^^^^
|
`LocalMapAutogradOp.forward` has type `(ctx: Any, fw_gm: GraphModule, bw_gm: GraphModule, num_fw_ins: int, num_fw_outs: int, filtered_grads_idx: set[int], *args: Any, **kwargs: Any) -> tuple[Tensor | None, ...]`, which is not consistent with `(*args: Any, **kwargs: Any) -> Any` in `Function.forward` (the type of read-write attributes cannot be changed)
ERROR Argument `tuple[Tensor]` is not assignable to parameter `x` with type `Tensor` in function `torch._functorch._aot_autograd.runtime_wrappers.coerce_to_expected_memory_format` [bad-argument-type]
--> torch/_higher_order_ops/local_map.py:248:61
|
248 | grads[i] = coerce_to_expected_memory_format(grads[i], meta)
| ^^^^^^^^
|
ERROR Class member `MapAutogradOp.forward` overrides parent class `Function` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/map.py:128:9
|
128 | def forward(ctx, f, num_mapped_args, *flat_args):
| ^^^^^^^
|
`MapAutogradOp.forward` has type `(ctx: Unknown, f: Unknown, num_mapped_args: Unknown, *flat_args: Unknown) -> Unknown`, which is not consistent with `(*args: Any, **kwargs: Any) -> Any` in `Function.forward` (the type of read-write attributes cannot be changed)
ERROR Class member `ScanOp.gen_schema` overrides parent class `HigherOrderOperator` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/scan.py:244:9
|
244 | def gen_schema(self, combine_fn, init, xs, additional_inputs):
| ^^^^^^^^^^
|
`ScanOp.gen_schema` has type `BoundMethod[ScanOp, (self: ScanOp, combine_fn: Unknown, init: Unknown, xs: Unknown, additional_inputs: Unknown) -> Unknown]`, which is not assignable to `BoundMethod[ScanOp, (self: ScanOp, *args: Unknown, **kwargs: Unknown) -> Unknown]`, the type of `HigherOrderOperator.gen_schema`
ERROR Class member `ScanAutogradOp.forward` overrides parent class `Function` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/scan.py:451:9
|
451 | def forward(
| ^^^^^^^
|
`ScanAutogradOp.forward` has type `(ctx: Unknown, hop_partitioned_graph: Unknown, n_init: Unknown, n_xs: Unknown, n_additional_inputs: Unknown, *operands: Unknown) -> Unknown`, which is not consistent with `(*args: Any, **kwargs: Any) -> Any` in `Function.forward` (the type of read-write attributes cannot be changed)
ERROR Argument `tuple[str, tuple[list[SymInt | int]]] | tuple[str, tuple[list[SymInt | int], list[SymInt | int], SymInt | int]] | None` is not assignable to parameter `tma_meta` with type `tuple[str, tuple[list[SymInt | int]]] | tuple[str, tuple[list[SymInt | int], list[SymInt | int], SymInt | int]]` in function `maybe_unpack_tma_stable_metadata` [bad-argument-type]
--> torch/_higher_order_ops/triton_kernel_wrap.py:295:17
|
295 | tma_descriptor_metadata.get(name, None)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR `(...) -> Unknown` is not subscriptable [unsupported-operation]
--> torch/_higher_order_ops/triton_kernel_wrap.py:428:37
|
428 | attrvals.append(spec[1])
| ^^^^^^^
|
ERROR Expected a callable, got `None` [not-callable]
--> torch/_higher_order_ops/triton_kernel_wrap.py:446:20
|
446 | return mangle_type(arg)
| ^^^^^^^^^^^
|
ERROR Argument `str | None` is not assignable to parameter `fn_name` with type `str` in function `MemoizeWithCycleCheck.__call__` [bad-argument-type]
--> torch/_higher_order_ops/triton_kernel_wrap.py:818:56
|
818 | tma_stores = get_tma_stores(functions, op.fn_call_name)
| ^^^^^^^^^^^^^^^
|
ERROR Argument `str | None` is not assignable to parameter `fn_name` with type `str` in function `MemoizeWithCycleCheck.__call__` [bad-argument-type]
--> torch/_higher_order_ops/triton_kernel_wrap.py:898:32
|
898 | functions, op.fn_call_name, len(op.args)
| ^^^^^^^^^^^^^^^
|
ERROR Object of class `Autotuner` has no attribute `fn`
Object of class `JITFunction` has no attribute `fn` [missing-attribute]
--> torch/_higher_order_ops/triton_kernel_wrap.py:951:16
|
951 | assert kernel.fn.__name__ in kernel_name
| ^^^^^^^^^
|
ERROR Object of class `Autotuner` has no attribute `fn`
Object of class `JITFunction` has no attribute `fn` [missing-attribute]
--> torch/_higher_order_ops/triton_kernel_wrap.py:1054:13
|
1054 | kernel.fn.__name__, kernel.configs, grid
| ^^^^^^^^^
|
ERROR Object of class `Autotuner` has no attribute `configs`
Object of class `JITFunction` has no attribute `configs` [missing-attribute]
--> torch/_higher_order_ops/triton_kernel_wrap.py:1054:33
|
1054 | kernel.fn.__name__, kernel.configs, grid
| ^^^^^^^^^^^^^^
|
ERROR Object of class `Autotuner` has no attribute `arg_names`
Object of class `JITFunction` has no attribute `arg_names` [missing-attribute]
--> torch/_higher_order_ops/triton_kernel_wrap.py:1103:17
|
1103 | for name in kernel.arg_names:
| ^^^^^^^^^^^^^^^^
|
ERROR Cannot index into `Autotuner` [index-error]
--> torch/_higher_order_ops/triton_kernel_wrap.py:1111:5
|
1111 | kernel[grid_fn](*args, **kwargs, **constant_args)
| ^^^^^^^^^^^^^^^
|
Object of class `Autotuner` has no attribute `__getitem__`
ERROR Cannot index into `JITFunction` [index-error]
--> torch/_higher_order_ops/triton_kernel_wrap.py:1111:5
|
1111 | kernel[grid_fn](*args, **kwargs, **constant_args)
| ^^^^^^^^^^^^^^^
|
Object of class `JITFunction` has no attribute `__getitem__`
ERROR `((dict[str, int]) -> tuple[int, ...]) | tuple[SymInt | int | Unknown, ...] | None` is not assignable to attribute `grid` with type `((dict[str, int]) -> tuple[int, ...]) | tuple[SymInt | int | Unknown, ...]` [bad-assignment]
--> torch/_higher_order_ops/triton_kernel_wrap.py:1516:25
|
1516 | variable.grid = grid
| ^^^^
|
ERROR Object of class `Autotuner` has no attribute `run`
Object of class `JITFunction` has no attribute `run` [missing-attribute]
--> torch/_higher_order_ops/triton_kernel_wrap.py:2060:20
|
2060 | return self.kernel.run(*args, **kwargs)
| ^^^^^^^^^^^^^^^
|
ERROR Cannot index into `Autotuner` [index-error]
--> torch/_higher_order_ops/triton_kernel_wrap.py:2071:20
|
2071 | return self.kernel[self.grid](*args, **kwargs)
| ^^^^^^^^^^^^^^^^^^^^^^
|
Object of class `Autotuner` has no attribute `__getitem__`
ERROR Cannot index into `JITFunction` [index-error]
--> torch/_higher_order_ops/triton_kernel_wrap.py:2071:20
|
2071 | return self.kernel[self.grid](*args, **kwargs)
| ^^^^^^^^^^^^^^^^^^^^^^
|
Object of class `JITFunction` has no attribute `__getitem__`
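The cluster of `missing-attribute` and `index-error` reports against `Autotuner` and `JITFunction` all share one shape: the checker resolves the kernel to a declared type (or union) that does not advertise `fn`, `configs`, `arg_names`, or `__getitem__`, even though the runtime objects carry them. A toy sketch of that shape (the class names here are stand-ins, not Triton's API) and an `isinstance` narrow that makes the access well-typed:

```python
class StubKernel:
    """Stand-in for a stub/base type that omits runtime-only attributes."""

class RealKernel(StubKernel):
    """Concrete runtime class that actually carries `fn`."""
    def __init__(self) -> None:
        self.fn = lambda: "ran"

def call_fn(kernel: StubKernel) -> str:
    # `kernel.fn` alone is flagged: StubKernel declares no `fn`.
    # Narrowing to the concrete class (or typing.cast) satisfies the checker.
    assert isinstance(kernel, RealKernel)
    return kernel.fn()

print(call_fn(RealKernel()))  # ran
```

When narrowing is impractical (e.g. third-party stubs that simply omit the attributes), a targeted `typing.cast` or a Protocol describing the needed attributes is the usual alternative.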
ERROR `Literal[True]` is not assignable to attribute `allow_empty_graphs` with type `Literal[False]` [bad-assignment]
--> torch/_higher_order_ops/utils.py:273:51
|
273 | torch._dynamo.config.allow_empty_graphs = True
| ^^^^
|
ERROR `str | None` is not assignable to `None` (caused by inconsistent types when breaking cycles) [bad-assignment]
--> torch/_higher_order_ops/utils.py:443:5
|
443 | / while not next_name:
444 | | candidate = f"{prefix}_{i}"
445 | | if hasattr(root, candidate):
446 | | i += 1
447 | | else:
448 | | next_name = candidate
| |_________________________________^
|
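This `bad-assignment` ("inconsistent types when breaking cycles") arises whenever a variable initialized to `None` is only ever assigned a `str` inside a loop: the checker pins the variable's type to `None` from the initializer, then rejects the later `str`. A self-contained sketch of the same loop shape with the conventional fix, an explicit `str | None` annotation:

```python
def fresh_name(prefix: str, taken: set[str]) -> str:
    # Without the annotation, the initializer can pin the type to `None`,
    # making the assignment inside the loop a bad-assignment.
    next_name: str | None = None
    i = 0
    while not next_name:
        candidate = f"{prefix}_{i}"
        if candidate in taken:
            i += 1
        else:
            next_name = candidate
    return next_name

print(fresh_name("while_loop_cond_graph", {"while_loop_cond_graph_0"}))
```

The identical pattern appears again below at `while_loop.py:455` (a `str | None` name search) and `while_loop.py:728` (an `int | None` loop counter), and the same annotation resolves each.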
ERROR Could not import `prepare_fw_with_masks_all_requires_grad` from `torch._higher_order_ops.utils` [missing-module-attribute]
--> torch/_higher_order_ops/utils.py:799:47
|
799 | from torch._higher_order_ops.utils import prepare_fw_with_masks_all_requires_grad
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Returned type `tuple[dict[int, int] | list[int] | list[Any] | tuple[Any, ...], dict[int, int] | list[int] | list[Any] | tuple[Any, ...], dict[int, int] | list[int] | list[Any] | tuple[Any, ...], dict[int, int] | list[int] | list[Any] | tuple[Any, ...]]` is not assignable to declared return type `tuple[dict[int, int], dict[int, int], dict[int, int], list[int]]` [bad-return]
--> torch/_higher_order_ops/utils.py:943:12
|
943 | return inp_inp_alias_map, inp_out_alias_map, out_out_alias_map, mutated_inputs
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Class member `WhileLoopOp.gen_schema` overrides parent class `HigherOrderOperator` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/while_loop.py:57:9
|
57 | def gen_schema(self, cond_fn, body_fn, carried_inputs, additional_inputs):
| ^^^^^^^^^^
|
`WhileLoopOp.gen_schema` has type `BoundMethod[WhileLoopOp, (self: WhileLoopOp, cond_fn: Unknown, body_fn: Unknown, carried_inputs: Unknown, additional_inputs: Unknown) -> Unknown]`, which is not assignable to `BoundMethod[WhileLoopOp, (self: WhileLoopOp, *args: Unknown, **kwargs: Unknown) -> Unknown]`, the type of `HigherOrderOperator.gen_schema`
ERROR Object of class `Tensor` has no attribute `constant` [missing-attribute]
--> torch/_higher_order_ops/while_loop.py:433:21
|
433 | x.constant = None
| ^^^^^^^^^^
|
ERROR `str | None` is not assignable to `None` (caused by inconsistent types when breaking cycles) [bad-assignment]
--> torch/_higher_order_ops/while_loop.py:455:9
|
455 | / while not next_name:
456 | | candidate = f"while_loop_cond_graph_{i}"
457 | | if hasattr(proxy_mode.tracer.root, candidate):
458 | | i += 1
459 | | else:
460 | | next_name = candidate
| |_____________________________________^
|
ERROR Class member `WhileLoopAutogradOp.forward` overrides parent class `Function` in an inconsistent manner [bad-override]
--> torch/_higher_order_ops/while_loop.py:699:9
|
699 | def forward(
| ^^^^^^^
|
`WhileLoopAutogradOp.forward` has type `(ctx: Unknown, cond_fn: Unknown, body_fn: Unknown, num_carried_inputs: Unknown, num_additional_inputs: Unknown, *carries_and_inputs: Unknown) -> Unknown`, which is not consistent with `(*args: Any, **kwargs: Any) -> Any` in `Function.forward` (the type of read-write attributes cannot be changed)
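Both `forward` `bad-override` reports (here and on `ScanAutogradOp` at the top of this section) follow from the base class declaring `forward(*args: Any, **kwargs: Any) -> Any`: a subclass signature with named positional parameters cannot accept every call the base signature admits. A minimal sketch of why the override is inconsistent, with the per-site suppression that is commonly used when the narrow signature is intentional (the class names are illustrative, not the autograd API):

```python
from typing import Any

class Base:
    # Permissive base: any override must stay compatible with (*args, **kwargs).
    def forward(self, *args: Any, **kwargs: Any) -> Any:
        raise NotImplementedError

class Narrow(Base):
    # Named parameters can't absorb arbitrary keyword calls, so checkers
    # flag this as an inconsistent override; a targeted ignore is typical.
    def forward(self, ctx: Any, *operands: Any) -> Any:  # type: ignore[override]
        return len(operands)

print(Narrow().forward(None, 1, 2, 3))  # 3
```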
ERROR `int | None` is not assignable to `None` (caused by inconsistent types when breaking cycles) [bad-assignment]
--> torch/_higher_order_ops/while_loop.py:728:9
|
728 | / for out in fw_outputs:
729 | | if isinstance(out, torch.Tensor):
730 | | if loop_count is not None:
731 | | assert out.size(0) == loop_count
732 | | else:
733 | | loop_count = out.size(0)
| |____________________________________________^
|
ERROR Argument `tuple[Tensor, *tuple[Tensor | Unknown, ...]]` is not assignable to parameter `carried_inputs` with type `tuple[Tensor | bool | float | int]` in function `WhileLoopOp.__call__` [bad-argument-type]
--> torch/_higher_order_ops/while_loop.py:881:17
|
881 | / (
882 | | init_idx,
883 | | *init_grad_carries,
884 | | *init_grad_additional_inputs,
885 | | ),
| |_________________^
|
ERROR No matching overload found for function `torch._C._VariableFunctions.sum` [no-matching-overload]
--> torch/_refs/__init__.py:883:25
|
883 | return torch.sum(torch.exp(self), dim, keepdim).log()
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Possible overloads:
(input: Tensor, *, dtype: dtype | None = None) -> Tensor [closest match]
(input: Tensor, dim: Size | int | list[int] | tuple[int, ...] | None, keepdim: bool = False, *, dtype: dtype | None = None, out: Tensor | None = None) -> Tensor
(input: Tensor, dim: Sequence[EllipsisType | str | None], keepdim: bool = False, *, dtype: dtype | None = None, out: Tensor | None = None) -> Tensor
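The `no-matching-overload` failures in `_refs/__init__.py` (here and below for `squeeze`, `any`, and `canonicalize_dims`) typically mean the argument's declared type is a union wider than any single overload's parameter, so no one overload matches the whole union. A toy reproduction of the mechanism and the usual remedy of materializing the argument into a concrete type (`total` and `caller` are hypothetical, chosen to mirror the `dim` argument shape):

```python
from typing import Iterable, overload

@overload
def total(xs: list[int]) -> int: ...
@overload
def total(xs: tuple[int, ...]) -> int: ...
def total(xs: Iterable[int]) -> int:
    # The implementation is permissive; only the overloads face callers.
    return sum(xs)

def caller(dims: Iterable[int]) -> int:
    # `total(dims)` would fail resolution: Iterable[int] matches neither
    # list[int] nor tuple[int, ...]. Materializing narrows the argument.
    return total(tuple(dims))

print(caller(range(3)))  # 3
```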
ERROR Argument `Iterable[int] | list[int] | tuple[int] | tuple[int, ...]` is not assignable to parameter `dim` with type `Size | int | list[int] | tuple[int, ...]` in function `torch._C._VariableFunctions.amax` [bad-argument-type]
--> torch/_refs/__init__.py:884:42
|
884 | maxes = torch.amax(torch.real(self), dim, keepdim=True)
| ^^^
|
ERROR No matching overload found for function `torch._C._VariableFunctions.squeeze` [no-matching-overload]
--> torch/_refs/__init__.py:886:57
|
886 | maxes_squeezed = maxes if keepdim else torch.squeeze(maxes, dim)
| ^^^^^^^^^^^^
|
Possible overloads:
(input: Tensor) -> Tensor [closest match]
(input: Tensor, dim: int) -> Tensor
(input: Tensor, dim: Size | list[int] | tuple[int, ...]) -> Tensor
(input: Tensor, dim: EllipsisType | str | None) -> Tensor
ERROR No matching overload found for function `torch._C._VariableFunctions.sum` [no-matching-overload]
--> torch/_refs/__init__.py:887:23
|
887 | result = torch.sum(torch.exp(self - maxes), dim, keepdim)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Possible overloads:
(input: Tensor, *, dtype: dtype | None = None) -> Tensor [closest match]
(input: Tensor, dim: Size | int | list[int] | tuple[int, ...] | None, keepdim: bool = False, *, dtype: dtype | None = None, out: Tensor | None = None) -> Tensor
(input: Tensor, dim: Sequence[EllipsisType | str | None], keepdim: bool = False, *, dtype: dtype | None = None, out: Tensor | None = None) -> Tensor
ERROR Argument `SymBool | SymFloat | SymInt | bool | complex | float | int` is not assignable to parameter `a` with type `bool | complex | float | int` in function `scalar_tensor` [bad-argument-type]
--> torch/_refs/__init__.py:1244:27
|
1244 | b = scalar_tensor(b, dtype=a.dtype, device=a.device)
| ^
|
ERROR Argument `Tensor | bool | complex | float | int` is not assignable to parameter `x` with type `SupportsAbs[Tensor]` in function `abs` [bad-argument-type]
--> torch/_refs/__init__.py:1248:38
|
1248 | return where(signbit(b), neg(abs(a)), abs(a))
| ^
|
ERROR Argument `Tensor | bool | complex | float | int` is not assignable to parameter `x` with type `SupportsAbs[Tensor]` in function `abs` [bad-argument-type]
--> torch/_refs/__init__.py:1248:47
|
1248 | return where(signbit(b), neg(abs(a)), abs(a))
| ^
|
ERROR No matching overload found for function `torch._prims_common.wrappers._maybe_convert_to_dtype` [no-matching-overload]
--> torch/_refs/__init__.py:1333:32
|
1333 | a = _maybe_convert_to_dtype(a, dtype)
| ^^^^^^^^^^
|
Possible overloads:
(a: Tensor, dtype: dtype) -> Tensor [closest match]
(a: bool | complex | float | int, dtype: dtype) -> bool | complex | float | int
(a: Sequence[Unknown], dtype: dtype) -> Sequence[Unknown]
(a: None, dtype: dtype) -> None
ERROR No matching overload found for function `torch._prims_common.wrappers._maybe_convert_to_dtype` [no-matching-overload]
--> torch/_refs/__init__.py:1334:32
|
1334 | b = _maybe_convert_to_dtype(b, dtype)
| ^^^^^^^^^^
|
Possible overloads:
(a: Tensor, dtype: dtype) -> Tensor [closest match]
(a: bool | complex | float | int, dtype: dtype) -> bool | complex | float | int
(a: Sequence[Unknown], dtype: dtype) -> Sequence[Unknown]
(a: None, dtype: dtype) -> None
ERROR Returned type `Literal[1] | Unknown` is not assignable to declared return type `Tensor` [bad-return]
--> torch/_refs/__init__.py:1337:12
|
1337 | return pow(a, b)
| ^^^^^^^^^
|
ERROR Argument `SymBool | SymFloat | SymInt | bool | complex | float | int` is not assignable to parameter `a` with type `bool | complex | float | int` in function `scalar_tensor` [bad-argument-type]
--> torch/_refs/__init__.py:1378:27
|
1378 | a = scalar_tensor(a)
| ^
|
ERROR Argument `SymBool | SymFloat | SymInt | bool | complex | float | int` is not assignable to parameter `a` with type `bool | complex | float | int` in function `scalar_tensor` [bad-argument-type]
--> torch/_refs/__init__.py:1379:27
|
1379 | b = scalar_tensor(b)
| ^
|
ERROR Argument `SymBool | SymFloat | SymInt | bool | complex | float | int` is not assignable to parameter `a` with type `bool | complex | float | int` in function `scalar_tensor` [bad-argument-type]
--> torch/_refs/__init__.py:1381:27
|
1381 | b = scalar_tensor(b, dtype=a.dtype, device=a.device)
| ^
|
ERROR Argument `SymBool | SymFloat | SymInt | bool | complex | float | int` is not assignable to parameter `a` with type `bool | complex | float | int` in function `scalar_tensor` [bad-argument-type]
--> torch/_refs/__init__.py:1383:27
|
1383 | a = scalar_tensor(a, dtype=b.dtype, device=b.device)
| ^
|
ERROR Argument `SymBool | SymFloat | SymInt | bool | complex | float | int` is not assignable to parameter `a` with type `bool | complex | float | int` in function `scalar_tensor` [bad-argument-type]
--> torch/_refs/__init__.py:1859:27
|
1859 | a = scalar_tensor(a, dtype=b.dtype, device=b.device)
| ^
|
ERROR Argument `SymBool | SymFloat | SymInt | bool | complex | float | int` is not assignable to parameter `a` with type `bool | complex | float | int` in function `scalar_tensor` [bad-argument-type]
--> torch/_refs/__init__.py:1861:27
|
1861 | b = scalar_tensor(b, dtype=a.dtype, device=a.device)
| ^
|
ERROR No matching overload found for function `torch._C._VariableFunctions.any` [no-matching-overload]
--> torch/_refs/__init__.py:2336:41
|
2336 | result = torch.logical_not(torch.any(torch.logical_not(a), dim, keepdim=keepdim))
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Possible overloads:
(input: Tensor, *, out: Tensor | None = None) -> Tensor
(input: Tensor, dim: Size | list[int] | tuple[int, ...] | None = None, keepdim: bool = False, *, out: Tensor | None = None) -> Tensor [closest match]
(input: Tensor, dim: int, keepdim: bool = False, *, out: Tensor | None = None) -> Tensor
(input: Tensor, dim: EllipsisType | str | None, keepdim: bool = False, *, out: Tensor | None = None) -> Tensor
ERROR `Tensor | None` is not assignable to `None` (caused by inconsistent types when breaking cycles) [bad-assignment]
--> torch/_refs/__init__.py:2853:5
|
2853 | / for i, t in enumerate(tensors):
2854 | | if example is None:
2855 | | if t.ndim != 1:
2856 | | example = t
2857 | | else:
2858 | | if t.ndim != 1:
| |____________________________^
|
ERROR No matching overload found for function `torch._prims_common.canonicalize_dims` [no-matching-overload]
--> torch/_refs/__init__.py:3231:40
|
3231 | norm_dims = utils.canonicalize_dims(a.ndim, norm_dims)
| ^^^^^^^^^^^^^^^^^^^
|
Possible overloads:
(rank: int, indices: Sequence[int], wrap_scalar: bool = True) -> tuple[int, ...] [closest match]
(rank: int, indices: int, wrap_scalar: bool = True) -> int
ERROR Argument `tuple[int, ...]` is not assignable to parameter `y` with type `Size` in function `torch.fx.experimental.symbolic_shapes.sym_eq` [bad-argument-type]
--> torch/_refs/__init__.py:3344:48
|
3344 | weight is None or sym_eq(weight.shape, tuple(normalized_shape)),
| ^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Argument `tuple[int, ...]` is not assignable to parameter `y` with type `Size` in function `torch.fx.experimental.symbolic_shapes.sym_eq` [bad-argument-type]
--> torch/_refs/__init__.py:3352:44
|
3352 | bias is None or sym_eq(bias.shape, tuple(normalized_shape)),
| ^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Argument `tuple[int, ...]` is not assignable to parameter `y` with type `Size` in function `torch.fx.experimental.symbolic_shapes.sym_eq` [bad-argument-type]
--> torch/_refs/__init__.py:3362:60
|
3362 | input.shape[(input.ndim - normalized_ndim) :], tuple(normalized_shape)
| ^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR No matching overload found for function `torch._prims_common.canonicalize_dims` [no-matching-overload]
--> torch/_refs/__init__.py:3956:35
|
3956 | dims = utils.canonicalize_dims(a.ndim, dims)
| ^^^^^^^^^^^^^^
|
Possible overloads:
(rank: int, indices: Sequence[int], wrap_scalar: bool = True) -> tuple[int, ...] [closest match]
(rank: int, indices: int, wrap_scalar: bool = True) -> int
ERROR Argument `Iterable[Unknown] | tuple[Unknown]` is not assignable to parameter `obj` with type `Sized` in function `len` [bad-argument-type]
--> torch/_refs/__init__.py:3968:29
|
3968 | if a.dim() == 0 and len(dims) > 0:
| ^^^^
|
ERROR Cannot index into `Iterable[Unknown]` [index-error]
--> torch/_refs/__init__.py:3970:39
|
3970 | f"Dimension specified as {dims[0]} but tensor has no dimensions"
| ^^^^^^^
|
Object of class `Iterable` has no attribute `__getitem__`
ERROR Argument `Iterable[int] | list[int] | tuple[int] | tuple[int, ...]` is not assignable to parameter `obj` with type `Sized` in function `len` [bad-argument-type]
--> torch/_refs/__init__.py:3973:22
|
3973 | len_shifts = len(shifts)
| ^^^^^^
|
ERROR Argument `Iterable[Unknown] | tuple[Unknown]` is not assignable to parameter `obj` with type `Sized` in function `len` [bad-argument-type]
--> torch/_refs/__init__.py:3974:20
|
3974 | len_dims = len(dims)
| ^^^^
|
ERROR Argument `Iterable[int] | list[int] | tuple[int] | tuple[int, ...]` is not assignable to parameter `shifts` with type `Sequence[SymInt | int] | SymInt | int` in function `torch._C._VariableFunctions.roll` [bad-argument-type]
--> torch/_refs/__init__.py:3981:49
|
3981 | return torch.roll(torch.flatten(a), shifts, 0).view(a.shape)
| ^^^^^^
|
ERROR Cannot index into `Iterable[int]` [index-error]
--> torch/_refs/__init__.py:3987:23
|
3987 | tail_shifts = shifts[1:]
| ^^^^^^^^^^
|
Object of class `Iterable` has no attribute `__getitem__`
ERROR Cannot index into `Iterable[Unknown]` [index-error]
--> torch/_refs/__init__.py:3988:21
|
3988 | tail_dims = dims[1:]
| ^^^^^^^^
|
Object of class `Iterable` has no attribute `__getitem__`
ERROR Cannot index into `Iterable[int]` [index-error]
--> torch/_refs/__init__.py:3989:43
|
3989 | first_dim_rolled = torch.roll(a, (shifts[0],), dims[0])
| ^^^^^^^^^
|
Object of class `Iterable` has no attribute `__getitem__`
ERROR Cannot index into `Iterable[Unknown]` [index-error]
--> torch/_refs/__init__.py:3989:56
|
3989 | first_dim_rolled = torch.roll(a, (shifts[0],), dims[0])
| ^^^^^^^
|
Object of class `Iterable` has no attribute `__getitem__`
ERROR Cannot index into `Iterable[Unknown]` [index-error]
--> torch/_refs/__init__.py:3994:11
|
3994 | dim = dims[0]
| ^^^^^^^
|
Object of class `Iterable` has no attribute `__getitem__`
ERROR Cannot index into `Iterable[int]` [index-error]
--> torch/_refs/__init__.py:3996:21
|
3996 | start = (size - shifts[0]) % size
| ^^^^^^^^^
|
Object of class `Iterable` has no attribute `__getitem__`
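The run of `index-error` and `Sized` complaints in the `roll` reference all stem from one fact: `Iterable` promises only iteration, so `dims[0]`, `shifts[1:]`, and `len(dims)` are all unsupported on that declared type. Converting once to a `Sequence` restores indexing, slicing, and `len` (a sketch with hypothetical names, mirroring the `shifts`/`dims` usage above):

```python
from typing import Iterable, Sequence

def head_tail(dims: Iterable[int]) -> tuple[int, list[int]]:
    # `dims[0]` / `dims[1:]` are rejected: Iterable has no __getitem__,
    # and len() needs Sized. One list() materialization fixes all three.
    seq: Sequence[int] = list(dims)
    return seq[0], list(seq[1:])

print(head_tail(iter([2, 0, 1])))  # (2, [0, 1])
```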
ERROR No matching overload found for function `sum` [no-matching-overload]
--> torch/_refs/__init__.py:4077:31
|
4077 | true_divide(a_exp, sum(a_exp, dim, keepdim=True)), result_dtype
| ^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Possible overloads:
(iterable: Iterable[Literal[-20, -19, -18, -17, -16, -15, -14, -13, -12, -11, -10, -9, -8, -7, -6, -5, -4, -3, -2, -1, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25] | bool], /, start: int = 0) -> int [closest match]
(iterable: Iterable[_SupportsSumNoDefaultT], /) -> Literal[0] | _SupportsSumNoDefaultT
(iterable: Iterable[_AddableT1], /, start: _AddableT2) -> _AddableT1 | _AddableT2
ERROR No matching overload found for function `torch._prims_common.canonicalize_dims` [no-matching-overload]
--> torch/_refs/__init__.py:4254:34
|
4254 | dim = utils.canonicalize_dims(ndim, dim)
| ^^^^^^^^^^^
|
Possible overloads:
(rank: int, indices: Sequence[int], wrap_scalar: bool = True) -> tuple[int, ...] [closest match]
(rank: int, indices: int, wrap_scalar: bool = True) -> int
ERROR `%` is not supported between `int` and `SymInt` [unsupported-operation]
--> torch/_refs/__init__.py:4394:34
|
4394 | (split_size != 0 and a.shape[dim] % split_size == 0),
| ^^^^^^^^^^^^^^^^^^^^^^^^^
|
Argument `SymInt` is not assignable to parameter `value` with type `int` in function `int.__mod__`
ERROR Argument `SymInt | int` is not assignable to parameter `indices_or_sections` with type `Tensor | int | list[int] | tuple[int, ...]` in function `tensor_split` [bad-argument-type]
--> torch/_refs/__init__.py:4405:32
|
4405 | return tensor_split(a, split_size, dim)
| ^^^^^^^^^^
|
ERROR `%` is not supported between `int` and `SymInt` [unsupported-operation]
--> torch/_refs/__init__.py:4435:34
|
4435 | (split_size != 0 and a.shape[0] % split_size == 0),
| ^^^^^^^^^^^^^^^^^^^^^^^
|
Argument `SymInt` is not assignable to parameter `value` with type `int` in function `int.__mod__`
ERROR Argument `SymInt | int` is not assignable to parameter `indices_or_sections` with type `Tensor | int | list[int] | tuple[int, ...]` in function `tensor_split` [bad-argument-type]
--> torch/_refs/__init__.py:4445:32
|
4445 | return tensor_split(a, split_size, 0)
| ^^^^^^^^^^
|
ERROR `%` is not supported between `int` and `SymInt` [unsupported-operation]
--> torch/_refs/__init__.py:4649:60
|
4649 | if isinstance(sections, IntLike) and (sections == 0 or a.shape[2] % sections != 0):
| ^^^^^^^^^^^^^^^^^^^^^
|
Argument `SymInt` is not assignable to parameter `value` with type `int` in function `int.__mod__`
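These `%` errors between `int` and `SymInt` reflect how binary-operator checking works: the checker consults `int.__mod__`, whose parameter is `int`, and stops there, even though at runtime Python falls back to the right operand's `__rmod__`. A toy model of that dispatch (the `Sym` class is a stand-in for `SymInt`, not the real type):

```python
class Sym:
    """Toy stand-in for a symbolic int that only defines reflected mod."""
    def __init__(self, v: int) -> None:
        self.v = v
    def __rmod__(self, other: int) -> int:
        return other % self.v

# At runtime `7 % Sym(4)` succeeds: int.__mod__ returns NotImplemented,
# so Python dispatches to Sym.__rmod__. A checker reading only
# int.__mod__'s signature rejects the expression; calling the reflected
# hook explicitly (or guarding on the operand type) makes intent visible.
print(7 % Sym(4))          # 3
print(Sym(4).__rmod__(7))  # 3
```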
ERROR Implementation signature `(*tensors: Tensor | list[Tensor] | tuple[Tensor], *, indexing: str) -> list[Tensor]` does not accept all arguments that overload signature `(tensors: Sequence[Tensor], indexing: str) -> Unknown` accepts [inconsistent-overload]
--> torch/_refs/__init__.py:5422:5
|
5422 | def meshgrid(tensors: Sequence[TensorLikeType], indexing: str):
| ^^^^^^^^
|
ERROR No matching overload found for function `torch._prims_common.wrappers._maybe_convert_to_dtype` [no-matching-overload]
--> torch/_refs/__init__.py:5848:36
|
5848 | value = _maybe_convert_to_dtype(value, a.dtype)
| ^^^^^^^^^^^^^^^^
|
Possible overloads:
(a: Tensor, dtype: dtype) -> Tensor [closest match]
(a: bool | complex | float | int, dtype: dtype) -> bool | complex | float | int
(a: Sequence[Unknown], dtype: dtype) -> Sequence[Unknown]
(a: None, dtype: dtype) -> None
ERROR `dtype | Unknown | None` is not assignable to `None` (caused by inconsistent types when breaking cycles) [bad-assignment]
--> torch/_refs/__init__.py:6642:9
|
6642 | / for i in range(length):
6643 | | cur_item = obj[i]
6644 | | # TODO: test this
6645 | | """
6646 | | if cur_item is obj:
6647 | | raise TypeError("new(): self-referential lists are incompatible")
| |__________________________________________________________________________________^
|
ERROR Argument `SymBool | SymFloat | SymInt | bool | complex | float | int` is not assignable to parameter `s` with type `bool | complex | float | int` in function `torch._C._VariableFunctions.scalar_tensor` [bad-argument-type]
--> torch/_refs/__init__.py:6679:36
|
6679 | return torch.scalar_tensor(obj, dtype=scalarType)
| ^^^
|
ERROR Cannot set item in `Sized` [unsupported-operation]
--> torch/_refs/fft.py:109:13
|
109 | pad_amount[pad_idx] = sizes[i] - x_sizes[dims[i]]
| ^^^^^^^^^^^^^^^^^^^
|
Object of class `Sized` has no attribute `__setitem__`
ERROR Argument `Sized | list[int]` is not assignable to parameter `pad` with type `Sequence[SymInt | int]` in function `torch._C._VariableFunctions.constant_pad_nd` [bad-argument-type]
--> torch/_refs/fft.py:114:37
|
114 | return torch.constant_pad_nd(x, pad_amount) if must_copy else x
| ^^^^^^^^^^
|
ERROR No matching overload found for function `torch._prims_common.canonicalize_dims` [no-matching-overload]
--> torch/_refs/linalg/__init__.py:219:34
|
219 | dim = utils.canonicalize_dims(A.ndim, dim)
| ^^^^^^^^^^^^^
|
Possible overloads:
(rank: int, indices: Sequence[int], wrap_scalar: bool = True) -> tuple[int, ...] [closest match]
(rank: int, indices: int, wrap_scalar: bool = True) -> int
ERROR Index 1 out of range for tuple with 1 elements [index-error]
--> torch/_refs/linalg/__init__.py:226:19
|
226 | dim[0] != dim[1],
| ^^^^^^
|
ERROR Index 1 out of range for tuple with 1 elements [index-error]
--> torch/_refs/linalg/__init__.py:227:79
|
227 | lambda: f"linalg.matrix_norm: dims must be different. Got ({dim[0]}, {dim[1]})",
| ^^^^^^
|
ERROR Index 1 out of range for tuple with 1 elements [index-error]
--> torch/_refs/linalg/__init__.py:248:51
|
248 | perm = _backshift_permutation(dim[0], dim[1], A.ndim)
| ^^^^^^
|
ERROR Index 1 out of range for tuple with 1 elements [index-error]
--> torch/_refs/linalg/__init__.py:271:51
|
271 | perm = _backshift_permutation(dim[0], dim[1], A.ndim)
| ^^^^^^
|
ERROR Cannot unpack tuple[int] | Unknown (of size 1) into 2 values [bad-unpacking]
--> torch/_refs/linalg/__init__.py:278:13
|
278 | dim0, dim1 = dim
| ^^^^^^^^^^
|
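This `bad-unpacking` follows directly from the `canonicalize_dims` overload errors just above: once the checker admits a 1-tuple for `dim`, `dim0, dim1 = dim` cannot be proven safe. An explicit length check narrows the type for the checker and documents the precondition for the reader (hypothetical helper, same shape as the flagged code):

```python
from typing import Sequence

def pair(dim: Sequence[int]) -> tuple[int, int]:
    # `d0, d1 = dim` is rejected when the inferred type admits length 1;
    # an explicit length guard narrows it before unpacking.
    if len(dim) != 2:
        raise ValueError(f"expected 2 dims, got {len(dim)}")
    d0, d1 = dim
    return d0, d1

print(pair((0, 1)))  # (0, 1)
```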
ERROR `Args[_P]` is not subscriptable [unsupported-operation]
--> torch/_refs/nn/functional/__init__.py:145:13
|
145 | a = args[0]
| ^^^^^^^
|
ERROR `Kwargs[_P]` is not subscriptable [unsupported-operation]
--> torch/_refs/nn/functional/__init__.py:148:12
|
148 | if kwargs["inplace"]:
| ^^^^^^^^^^^^^^^^^
|
ERROR `**` is not supported between `Tensor` and `Literal[2]` [unsupported-operation]
--> torch/_refs/nn/functional/__init__.py:628:47
|
628 | loss = torch.where(loss < beta, 0.5 * loss**2 / beta, loss - 0.5 * beta)
| ^^^^^^^
|
Argument `Literal[2]` is not assignable to parameter with type `TensorBase`
ERROR `**` is not supported between `Tensor` and `Literal[2]` [unsupported-operation]
--> torch/_refs/nn/functional/__init__.py:628:47
|
628 | loss = torch.where(loss < beta, 0.5 * loss**2 / beta, loss - 0.5 * beta)
| ^^^^^^^
|
Expected 1 more positional argument
ERROR Argument `SymBool | SymFloat | SymInt | bool | complex | float | int` is not assignable to parameter `a` with type `bool | complex | float | int` in function `torch._refs.scalar_tensor` [bad-argument-type]
--> torch/_refs/special/__init__.py:158:32
|
158 | b = refs.scalar_tensor(b, dtype=a.dtype, device=a.device)
| ^
|
ERROR Argument `SymBool | SymFloat | SymInt | bool | complex | float | int` is not assignable to parameter `a` with type `bool | complex | float | int` in function `torch._refs.scalar_tensor` [bad-argument-type]
--> torch/_refs/special/__init__.py:160:32
|
160 | a = refs.scalar_tensor(a, dtype=b.dtype, device=b.device)
| ^
|
ERROR No matching overload found for function `max` [no-matching-overload]
--> torch/jit/_decompositions.py:133:21
|
133 | return sum / max(0, denom)
| ^^^^^^^^^^
|
Possible overloads:
(arg1: SupportsRichComparisonT, arg2: SupportsRichComparisonT, /, *_args: SupportsRichComparisonT, *, key: None = None) -> SupportsRichComparisonT [closest match]
(arg1: _T, arg2: _T, /, *_args: _T, *, key: (_T) -> SupportsDunderGT[Any] | SupportsDunderLT[Any]) -> _T
(iterable: Iterable[SupportsRichComparisonT], /, *, key: None = None) -> SupportsRichComparisonT
(iterable: Iterable[_T], /, *, key: (_T) -> SupportsDunderGT[Any] | SupportsDunderLT[Any]) -> _T
(iterable: Iterable[SupportsRichComparisonT], /, *, key: None = None, default: _T) -> SupportsRichComparisonT | _T
(iterable: Iterable[_T1], /, *, key: (_T1) -> SupportsDunderGT[Any] | SupportsDunderLT[Any], default: _T2) -> _T1 | _T2
ERROR Could not find import of `monkeytype` [import-error]
--> torch/jit/_monkeytype_config.py:17:5
|
17 | from monkeytype import trace as monkeytype_trace
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Looked in these locations (from config in `/Users/maggiemoss/python_projects/pytorch/pyrefly.toml`):
Import root (inferred from project layout): "/Users/maggiemoss/python_projects/pytorch"
Site package path queried from interpreter: ["/Users/maggiemoss/.pyenv/versions/3.12.0/lib/python3.12", "/Users/maggiemoss/.pyenv/versions/3.12.0/lib/python3.12/lib-dynload", "/Users/maggiemoss/python_projects/pytorch/venv/lib/python3.12/site-packages", "/Users/maggiemoss/python_projects/pytorch"]
ERROR Object of class `JitTypeTraceStoreLogger` has no attribute `traces` [missing-attribute]
--> torch/jit/_monkeytype_config.py:90:13
|
90 | self.traces.append(trace)
| ^^^^^^^^^^^
|
ERROR Expected 0 positional arguments, got 1 in function `JitTypeTraceStoreLogger.__init__` [bad-argument-count]
--> torch/jit/_monkeytype_config.py:151:44
|
151 | return JitTypeTraceStoreLogger(self.trace_store())
| ^^^^^^^^^^^^^^^^^^
|
ERROR Object of class `FunctionType` has no attribute `__func__` [missing-attribute]
--> torch/jit/_recursive.py:750:16
|
750 | if item.__func__ in method_overloads:
| ^^^^^^^^^^^^^
|
ERROR Expected class object, got `(value: Unknown, type: Unknown) -> Unknown` [invalid-argument]
--> torch/jit/_script.py:548:38
|
548 | if isinstance(value, Attribute):
| ^^^^^^^^^
|
ERROR Class `RecursiveScriptModule` has no class attribute `_finalize_scriptmodule` [missing-attribute]
--> torch/jit/_script.py:659:13
|
659 | RecursiveScriptModule._finalize_scriptmodule(script_module)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Class `RecursiveScriptModule` has no class attribute `_construct` [missing-attribute]
--> torch/jit/_script.py:932:20
|
932 | return RecursiveScriptModule._construct(
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Instance-only attribute `__dict__` of class `RecursiveScriptModule` is not visible on the class [missing-attribute]
--> torch/jit/_script.py:941:23
|
941 | for name, item in RecursiveScriptModule.__dict__.items():
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Instance-only attribute `__dict__` of class `RecursiveScriptModule` is not visible on the class [missing-attribute]
--> torch/jit/_script.py:1009:25
|
1009 | name not in RecursiveScriptModule.__dict__
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Expected a callable, got `Tensor` [not-callable]
--> torch/jit/_script.py:1041:9
|
1041 | obj.__prepare_scriptable__() if hasattr(obj, "__prepare_scriptable__") else obj
| ^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Expected 0 positional arguments, got 1 in function `torch.jit._monkeytype_config.JitTypeTraceConfig.__init__` [bad-argument-count]
--> torch/jit/_script.py:1138:52
|
1138 | monkeytype_config = JitTypeTraceConfig(type_trace_db)
| ^^^^^^^^^^^^^
|
ERROR No matching overload found for function `os.fspath` [no-matching-overload]
--> torch/jit/_serialization.py:169:26
|
169 | cu, os.fspath(f), map_location, _extra_files, _restore_shapes
| ^^^
|
Possible overloads:
(path: str) -> str [closest match]
(path: bytes) -> bytes
(path: PathLike[AnyStr]) -> AnyStr
ERROR Expected 4 positional arguments, got 5 in function `torch._C.import_ir_module` [bad-argument-count]
--> torch/jit/_serialization.py:169:59
|
169 | cu, os.fspath(f), map_location, _extra_files, _restore_shapes
| ^^^^^^^^^^^^^^^
|
ERROR Object of class `str` has no attribute `read` [missing-attribute]
--> torch/jit/_serialization.py:173:17
|
173 | cu, f.read(), map_location, _extra_files, _restore_shapes
| ^^^^^^
|
ERROR Expected 4 positional arguments, got 5 in function `torch._C.import_ir_module_from_buffer` [bad-argument-count]
--> torch/jit/_serialization.py:173:55
|
173 | cu, f.read(), map_location, _extra_files, _restore_shapes
| ^^^^^^^^^^^^^^^
|
ERROR No matching overload found for function `os.fspath` [no-matching-overload]
--> torch/jit/_serialization.py:199:22
|
199 | f = os.fspath(f)
| ^^^
|
Possible overloads:
(path: str) -> str [closest match]
(path: bytes) -> bytes
(path: PathLike[AnyStr]) -> AnyStr
ERROR No matching overload found for function `os.fspath` [no-matching-overload]
--> torch/jit/_serialization.py:248:22
|
248 | f = os.fspath(f)
| ^^^
|
Possible overloads:
(path: str) -> str [closest match]
(path: bytes) -> bytes
(path: PathLike[AnyStr]) -> AnyStr
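The three `os.fspath` failures in `_serialization.py` share a cause: `f` is a union that also admits open file objects, and `os.fspath` accepts only `str | bytes | os.PathLike`. Narrowing to the path-like members before the call matches all three sites (a sketch with a hypothetical helper name):

```python
import io
import os
from typing import IO, Union

def to_path_or_file(
    f: Union[str, "os.PathLike[str]", IO[bytes]],
) -> Union[str, IO[bytes]]:
    # `os.fspath(f)` is rejected for the whole union because IO[bytes]
    # is not str | bytes | PathLike; narrow to the path-likes first.
    if isinstance(f, (str, os.PathLike)):
        return os.fspath(f)
    return f  # already an open file-like object

print(to_path_or_file("model.pt"))  # model.pt
```

The follow-on `str has no attribute read` error at line 173 is the mirror image: the file-object branch needs the complementary narrow before calling `f.read()`.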
ERROR Argument `SupportsIndex` is not assignable to parameter `dimension` with type `int` in function `check_cat_shape_except_dim` [bad-argument-type]
--> torch/jit/_shape_functions.py:564:68
|
564 | check_cat_shape_except_dim(not_skipped_tensor, tensor, dim, i)
| ^^^
|
ERROR Class `function` has no class attribute `_nested_map` [missing-attribute]
--> torch/jit/_trace.py:172:12
|
172 | return function._nested_map(
| ^^^^^^^^^^^^^^^^^^^^
|
ERROR Object of class `tuple` has no attribute `items` [missing-attribute]
--> torch/jit/_trace.py:338:31
|
338 | for name, data in inputs.items():
| ^^^^^^^^^^^^
|
ERROR Argument `Unknown | None` is not assignable to parameter `iterable` with type `Iterable[@_]` in function `tuple.__new__` [bad-argument-type]
--> torch/jit/_trace.py:742:32
|
742 | example_inputs = tuple(example_inputs)
| ^^^^^^^^^^^^^^
|
ERROR Argument `tuple[Unknown, ...] | tuple[Tensor | dict[Unknown, Unknown]] | Unknown | None` is not assignable to parameter `input_tuple` with type `tuple[Any, ...]` in function `torch._C._create_function_from_trace` [bad-argument-type]
--> torch/jit/_trace.py:768:13
|
768 | example_inputs,
| ^^^^^^^^^^^^^^
|
ERROR Could not import `BroadcastingList2` from `torch._jit_internal` [missing-module-attribute]
--> torch/jit/annotations.py:41:5
|
41 | BroadcastingList2,
| ^^^^^^^^^^^^^^^^^
|
ERROR Could not import `BroadcastingList3` from `torch._jit_internal` [missing-module-attribute]
--> torch/jit/annotations.py:42:5
|
42 | BroadcastingList3,
| ^^^^^^^^^^^^^^^^^
|
ERROR Cannot set item in `dict[str, type[Future] | type[Tensor] | type[_Await] | type[dict] | type[list] | Module | type[Optional] | type[Tuple] | type[Union]]` [unsupported-operation]
--> torch/jit/annotations.py:101:32
|
101 | self.env["RRef"] = RRef
| ^^^^
|
Argument `type[RRef]` is not assignable to parameter `value` with type `type[Future] | type[Tensor] | type[_Await] | type[dict] | type[list] | Module | type[Optional] | type[Tuple] | type[Union]` in function `dict.__setitem__`
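This `unsupported-operation` on `self.env["RRef"] = RRef` shows value-type lock-in: the dict's value type was inferred as the closed union of its initial entries, so any class outside that union is rejected on `__setitem__`. An explicit wide annotation on the dict avoids the lock-in (toy example, not the real `env` contents):

```python
# Inferred narrowly, {"list": list, "dict": dict} would only accept
# type[list] | type[dict] values; the annotation keeps the dict open.
env: dict[str, type] = {"list": list, "dict": dict}
env["set"] = set  # accepted: `type` covers any class
print(sorted(env))  # ['dict', 'list', 'set']
```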
ERROR No matching overload found for function `typing.MutableMapping.update` [no-matching-overload]
--> torch/jit/frontend.py:118:25
|
118 | pretty_node_names.update(
| _________________________^
119 | | {
120 | | ast.AsyncFunctionDef: "async function definitions",
121 | | ast.AsyncFor: "async for loops",
122 | | ast.AsyncWith: "async with statements",
123 | | ast.Try: "try blocks",
| |_______________________________^
|
Possible overloads:
(m: SupportsKeysAndGetItem[type[Assert] | type[Break] | type[ClassDef] | type[Continue] | type[Delete] | type[For] | type[FunctionDef] | type[Global] | type[Import] | type[ImportFrom] | type[Raise] | type[With], str], /) -> None [closest match]
(m: SupportsKeysAndGetItem[str, str], /, **kwargs: str) -> None
(m: Iterable[tuple[type[Assert] | type[Break] | type[ClassDef] | type[Continue] | type[Delete] | type[For] | type[FunctionDef] | type[Global] | type[Import] | type[ImportFrom] | type[Raise] | type[With], str]], /) -> None
(m: Iterable[tuple[str, str]], /, **kwargs: str) -> None
(**kwargs: str) -> None
ERROR No matching overload found for function `typing.MutableMapping.update` [no-matching-overload]
--> torch/jit/frontend.py:128:25
|
128 | node_start_tokens.update(
| _________________________^
129 | | {
130 | | ast.AsyncFunctionDef: "async def",
131 | | ast.AsyncFor: "async for",
132 | | ast.AsyncWith: "async with",
133 | | ast.Try: "try",
| |________________________^
|
Possible overloads:
(m: SupportsKeysAndGetItem[type[Assert] | type[Break] | type[ClassDef] | type[Continue] | type[Delete] | type[For] | type[FunctionDef] | type[Global] | type[Import] | type[ImportFrom] | type[Raise] | type[With], str], /) -> None [closest match]
(m: SupportsKeysAndGetItem[str, str], /, **kwargs: str) -> None
(m: Iterable[tuple[type[Assert] | type[Break] | type[ClassDef] | type[Continue] | type[Delete] | type[For] | type[FunctionDef] | type[Global] | type[Import] | type[ImportFrom] | type[Raise] | type[With], str]], /) -> None
(m: Iterable[tuple[str, str]], /, **kwargs: str) -> None
(**kwargs: str) -> None
ERROR No matching overload found for function `typing.MutableMapping.update` [no-matching-overload]
--> torch/jit/frontend.py:138:25
|
138 | pretty_node_names.update(
| _________________________^
139 | | {
140 | | ast.AnnAssign: "annotated assignments",
141 | | }
142 | | )
| |_^
|
Possible overloads:
(m: SupportsKeysAndGetItem[type[Assert] | type[Break] | type[ClassDef] | type[Continue] | type[Delete] | type[For] | type[FunctionDef] | type[Global] | type[Import] | type[ImportFrom] | type[Raise] | type[With], str], /) -> None [closest match]
(m: SupportsKeysAndGetItem[str, str], /, **kwargs: str) -> None
(m: Iterable[tuple[type[Assert] | type[Break] | type[ClassDef] | type[Continue] | type[Delete] | type[For] | type[FunctionDef] | type[Global] | type[Import] | type[ImportFrom] | type[Raise] | type[With], str]], /) -> None
(m: Iterable[tuple[str, str]], /, **kwargs: str) -> None
(**kwargs: str) -> None
ERROR Cannot set item in `dict[type[Add] | type[BitAnd] | type[BitOr] | type[BitXor] | type[Div] | type[FloorDiv] | type[LShift] | type[Mod] | type[Mult] | type[Pow] | type[RShift] | type[Sub], str]` [unsupported-operation]
--> torch/jit/frontend.py:862:15
|
862 | binop_map[ast.MatMult] = "@"
| ^^^^^^^^^^^
|
Argument `type[MatMult]` is not assignable to parameter `key` with type `type[Add] | type[BitAnd] | type[BitOr] | type[BitXor] | type[Div] | type[FloorDiv] | type[LShift] | type[Mod] | type[Mult] | type[Pow] | type[RShift] | type[Sub]` in function `dict.__setitem__`
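The two `dict.__setitem__` errors above (`self.env["RRef"]` and `binop_map[ast.MatMult]`) share a cause: the checker infers the dict's key/value types from the initial literal, so later insertions of new types are rejected. A minimal sketch of one fix, widening the annotation up front (the dict contents here are illustrative, not the torch ones):

```python
import ast
from typing import Dict

# Without the explicit annotation, the key type would be inferred as
# `type[Add] | type[Sub]` from the literal, and the MatMult insertion
# below would be rejected exactly as in the flagged frontend.py code.
binop_map: Dict[type[ast.operator], str] = {
    ast.Add: "+",
    ast.Sub: "-",
}
binop_map[ast.MatMult] = "@"  # OK: MatMult is a subclass of ast.operator
```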
ERROR `+=` is not supported between `str` and `EllipsisType` [unsupported-operation]
--> torch/jit/frontend.py:1223:17
|
1223 | s += value.value
| ^^^^^^^^^^^^^^^^
|
No matching overload found for function `str.__add__`
Possible overloads:
(value: LiteralString, /) -> LiteralString
(value: str, /) -> str [closest match]
ERROR `+=` is not supported between `str` and `bool` [unsupported-operation]
--> torch/jit/frontend.py:1223:17
|
1223 | s += value.value
| ^^^^^^^^^^^^^^^^
|
No matching overload found for function `str.__add__`
Possible overloads:
(value: LiteralString, /) -> LiteralString
(value: str, /) -> str [closest match]
ERROR `+=` is not supported between `str` and `bytes` [unsupported-operation]
--> torch/jit/frontend.py:1223:17
|
1223 | s += value.value
| ^^^^^^^^^^^^^^^^
|
No matching overload found for function `str.__add__`
Possible overloads:
(value: LiteralString, /) -> LiteralString
(value: str, /) -> str [closest match]
ERROR `+=` is not supported between `str` and `complex` [unsupported-operation]
--> torch/jit/frontend.py:1223:17
|
1223 | s += value.value
| ^^^^^^^^^^^^^^^^
|
No matching overload found for function `str.__add__`
Possible overloads:
(value: LiteralString, /) -> LiteralString
(value: str, /) -> str [closest match]
ERROR `+=` is not supported between `str` and `float` [unsupported-operation]
--> torch/jit/frontend.py:1223:17
|
1223 | s += value.value
| ^^^^^^^^^^^^^^^^
|
No matching overload found for function `str.__add__`
Possible overloads:
(value: LiteralString, /) -> LiteralString
(value: str, /) -> str [closest match]
ERROR `+=` is not supported between `str` and `int` [unsupported-operation]
--> torch/jit/frontend.py:1223:17
|
1223 | s += value.value
| ^^^^^^^^^^^^^^^^
|
No matching overload found for function `str.__add__`
Possible overloads:
(value: LiteralString, /) -> LiteralString
(value: str, /) -> str [closest match]
ERROR `+=` is not supported between `str` and `None` [unsupported-operation]
--> torch/jit/frontend.py:1223:17
|
1223 | s += value.value
| ^^^^^^^^^^^^^^^^
|
No matching overload found for function `str.__add__`
Possible overloads:
(value: LiteralString, /) -> LiteralString
(value: str, /) -> str [closest match]
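The seven `+=` errors above are one diagnostic per union member: `ast.Constant.value` covers `str | bytes | bool | int | float | complex | None | Ellipsis`, so `s += value.value` fails `str.__add__` for every non-`str` arm. A minimal sketch (hypothetical helper, not the torch code) showing the usual fix, narrowing with `isinstance` before concatenating:

```python
import ast

def collect_str_constants(tree: ast.AST) -> str:
    """Concatenate only the string constants in an AST."""
    s = ""
    for node in ast.walk(tree):
        # isinstance narrows node.value to str, so += now type-checks;
        # bytes, ints, None, Ellipsis, etc. are skipped.
        if isinstance(node, ast.Constant) and isinstance(node.value, str):
            s += node.value
    return s
```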
ERROR No matching overload found for function `os.fspath` [no-matching-overload]
--> torch/jit/mobile/__init__.py:47:67
|
47 | cpp_module = torch._C._load_for_lite_interpreter(os.fspath(f), map_location)
| ^^^
|
Possible overloads:
(path: str) -> str [closest match]
(path: bytes) -> bytes
(path: PathLike[AnyStr]) -> AnyStr
ERROR Object of class `str` has no attribute `read` [missing-attribute]
--> torch/jit/mobile/__init__.py:50:13
|
50 | f.read(), map_location
| ^^^^^^
|
ERROR No matching overload found for function `os.fspath` [no-matching-overload]
--> torch/jit/mobile/__init__.py:106:62
|
106 | return torch._C._get_model_bytecode_version(os.fspath(f_input))
| ^^^^^^^^^
|
Possible overloads:
(path: str) -> str [closest match]
(path: bytes) -> bytes
(path: PathLike[AnyStr]) -> AnyStr
ERROR Object of class `str` has no attribute `read` [missing-attribute]
--> torch/jit/mobile/__init__.py:108:65
|
108 | return torch._C._get_model_bytecode_version_from_buffer(f_input.read())
| ^^^^^^^^^^^^
|
ERROR No matching overload found for function `os.fspath` [no-matching-overload]
--> torch/jit/mobile/__init__.py:138:68
|
138 | return torch._C._get_mobile_model_contained_types(os.fspath(f_input))
| ^^^^^^^^^
|
Possible overloads:
(path: str) -> str [closest match]
(path: bytes) -> bytes
(path: PathLike[AnyStr]) -> AnyStr
ERROR Object of class `str` has no attribute `read` [missing-attribute]
--> torch/jit/mobile/__init__.py:140:71
|
140 | return torch._C._get_mobile_model_contained_types_from_buffer(f_input.read())
| ^^^^^^^^^^^^
|
ERROR No matching overload found for function `os.fspath` [no-matching-overload]
--> torch/jit/mobile/__init__.py:164:22
|
164 | os.fspath(f_input), os.fspath(f_output), to_version
| ^^^^^^^^^
|
Possible overloads:
(path: str) -> str [closest match]
(path: bytes) -> bytes
(path: PathLike[AnyStr]) -> AnyStr
ERROR No matching overload found for function `os.fspath` [no-matching-overload]
--> torch/jit/mobile/__init__.py:164:42
|
164 | os.fspath(f_input), os.fspath(f_output), to_version
| ^^^^^^^^^^
|
Possible overloads:
(path: str) -> str [closest match]
(path: bytes) -> bytes
(path: PathLike[AnyStr]) -> AnyStr
ERROR Object of class `PathLike` has no attribute `read`
Object of class `str` has no attribute `read` [missing-attribute]
--> torch/jit/mobile/__init__.py:168:13
|
168 | f_input.read(), str(f_output), to_version
| ^^^^^^^^^^^^
|
ERROR No matching overload found for function `os.fspath` [no-matching-overload]
--> torch/jit/mobile/__init__.py:187:65
|
187 | return torch._C._backport_for_mobile_to_buffer(os.fspath(f_input), to_version)
| ^^^^^^^^^
|
Possible overloads:
(path: str) -> str [closest match]
(path: bytes) -> bytes
(path: PathLike[AnyStr]) -> AnyStr
ERROR Object of class `str` has no attribute `read` [missing-attribute]
--> torch/jit/mobile/__init__.py:190:13
|
190 | f_input.read(), to_version
| ^^^^^^^^^^^^
|
ERROR No matching overload found for function `os.fspath` [no-matching-overload]
--> torch/jit/mobile/__init__.py:230:58
|
230 | return torch._C._get_model_ops_and_info(os.fspath(f_input))
| ^^^^^^^^^
|
Possible overloads:
(path: str) -> str [closest match]
(path: bytes) -> bytes
(path: PathLike[AnyStr]) -> AnyStr
ERROR Object of class `str` has no attribute `read` [missing-attribute]
--> torch/jit/mobile/__init__.py:232:49
|
232 | return torch._C._get_model_ops_and_info(f_input.read())
| ^^^^^^^^^^^^
|
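The alternating `os.fspath` / `.read` errors above all come from a path-or-buffer union that is never narrowed: `os.fspath` wants `str | bytes | PathLike`, while `.read` wants a file object, and the checker sees both calls against the full union. A minimal sketch of the narrowing pattern (the function name is hypothetical):

```python
import io
import os
from typing import BinaryIO, Union

def read_model_bytes(f: Union[str, os.PathLike, BinaryIO]) -> bytes:
    """Narrow a path-or-buffer argument before dispatching on it."""
    if isinstance(f, (str, os.PathLike)):
        # Here f is str | PathLike, so the os.fspath overloads match.
        with open(os.fspath(f), "rb") as fh:
            return fh.read()
    # The remaining type is BinaryIO, which does have .read.
    return f.read()
```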
ERROR Argument `str` is not assignable to parameter `object` with type `LiteralString` in function `list.append` [bad-argument-type]
--> torch/jit/supported_ops.py:264:35
|
264 | magic_methods_rows.append(f'"{fn}", "``{magic_method}``"')
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR Argument `str` is not assignable to parameter `object` with type `LiteralString` in function `list.append` [bad-argument-type]
--> torch/jit/supported_ops.py:282:35
|
282 | schemaless_ops.append(table_row)
| ^^^^^^^^^
|
ERROR No matching overload found for function `max` [no-matching-overload]
--> torch/optim/_muon.py:81:39
|
81 | adjusted_ratio = math.sqrt(max(1, A / B))
| ^^^^^^^^^^
|
Possible overloads:
(arg1: SupportsRichComparisonT, arg2: SupportsRichComparisonT, /, *_args: SupportsRichComparisonT, *, key: None = None) -> SupportsRichComparisonT [closest match]
(arg1: _T, arg2: _T, /, *_args: _T, *, key: (_T) -> SupportsDunderGT[Any] | SupportsDunderLT[Any]) -> _T
(iterable: Iterable[SupportsRichComparisonT], /, *, key: None = None) -> SupportsRichComparisonT
(iterable: Iterable[_T], /, *, key: (_T) -> SupportsDunderGT[Any] | SupportsDunderLT[Any]) -> _T
(iterable: Iterable[SupportsRichComparisonT], /, *, key: None = None, default: _T) -> SupportsRichComparisonT | _T
(iterable: Iterable[_T1], /, *, key: (_T1) -> SupportsDunderGT[Any] | SupportsDunderLT[Any], default: _T2) -> _T1 | _T2
ERROR Argument `Tensor` is not assignable to parameter `alpha` with type `bool | complex | float | int | None` in function `torch._C.TensorBase.add` [bad-argument-type]
--> torch/optim/adam.py:418:54
|
418 | grad = grad.add(param, alpha=weight_decay)
| ^^^^^^^^^^^^
|
ERROR No matching overload found for function `torch._C.TensorBase.lerp_` [no-matching-overload]
--> torch/optim/adam.py:447:22
|
447 | exp_avg.lerp_(grad, 1 - device_beta1)
| ^^^^^^^^^^^^^^^^^^^^^^^^
|
Possible overloads:
(end: Tensor, weight: Tensor) -> Tensor [closest match]
(end: Tensor, weight: bool | complex | float | int) -> Tensor
ERROR No matching overload found for function `torch._C._VariableFunctions._foreach_mul_` [no-matching-overload]
--> torch/optim/adam.py:695:28
|
695 | torch._foreach_mul_(device_exp_avg_sqs, beta2)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Possible overloads:
(self: list[Tensor] | tuple[Tensor, ...] | None, scalars: Sequence[bool | complex | float | int]) -> None [closest match]
(self: list[Tensor] | tuple[Tensor, ...] | None, other: Tensor) -> None
(self: list[Tensor] | tuple[Tensor, ...] | None, scalar: bool | complex | float | int) -> None
(self: list[Tensor] | tuple[Tensor, ...] | None, other: list[Tensor] | tuple[Tensor, ...] | None) -> None
ERROR `**` is not supported between `Tensor` and `float` [unsupported-operation]
--> torch/optim/asgd.py:266:29
|
266 | eta.copy_(lr / ((1 + lambd * lr * step_t) ** alpha))
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Argument `float` is not assignable to parameter with type `TensorBase`
ERROR `**` is not supported between `Tensor` and `float` [unsupported-operation]
--> torch/optim/asgd.py:266:29
|
266 | eta.copy_(lr / ((1 + lambd * lr * step_t) ** alpha))
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Expected 1 more positional argument
ERROR Cannot index into `Iterable[Unknown]` [index-error]
--> torch/optim/lbfgs.py:116:13
|
116 | bracket[0],
| ^^^^^^^^^^
|
Object of class `Iterable` has no attribute `__getitem__`
ERROR Cannot index into `Iterable[Unknown]` [index-error]
--> torch/optim/lbfgs.py:119:13
|
119 | bracket[1],
| ^^^^^^^^^^
|
Object of class `Iterable` has no attribute `__getitem__`
ERROR Cannot set item in `Iterable[Unknown]` [unsupported-operation]
--> torch/optim/lbfgs.py:154:13
|
154 | bracket[high_pos] = t
| ^^^^^^^^^^^^^^^^^
|
Object of class `Iterable` has no attribute `__setitem__`
ERROR Cannot index into `Iterable[Unknown]` [index-error]
--> torch/optim/lbfgs.py:163:29
|
163 | elif gtd_new * (bracket[high_pos] - bracket[low_pos]) >= 0:
| ^^^^^^^^^^^^^^^^^
|
Object of class `Iterable` has no attribute `__getitem__`
ERROR Cannot index into `Iterable[Unknown]` [index-error]
--> torch/optim/lbfgs.py:163:49
|
163 | elif gtd_new * (bracket[high_pos] - bracket[low_pos]) >= 0:
| ^^^^^^^^^^^^^^^^
|
Object of class `Iterable` has no attribute `__getitem__`
ERROR Cannot set item in `Iterable[Unknown]` [unsupported-operation]
--> torch/optim/lbfgs.py:165:17
|
165 | bracket[high_pos] = bracket[low_pos]
| ^^^^^^^^^^^^^^^^^
|
Object of class `Iterable` has no attribute `__setitem__`
ERROR `Iterable[Unknown]` is not subscriptable [unsupported-operation]
--> torch/optim/lbfgs.py:165:37
|
165 | bracket[high_pos] = bracket[low_pos]
| ^^^^^^^^^^^^^^^^
|
ERROR Cannot set item in `Iterable[Unknown]` [unsupported-operation]
--> torch/optim/lbfgs.py:171:13
|
171 | bracket[low_pos] = t
| ^^^^^^^^^^^^^^^^
|
Object of class `Iterable` has no attribute `__setitem__`
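The `bracket[...]` errors above stem from an annotation that is too abstract: `Iterable` promises only `__iter__`, not `__getitem__`/`__setitem__`. A minimal sketch of one fix, typing the container as a `MutableSequence` (or plain `list`) so indexing and item assignment type-check (hypothetical helper, not the lbfgs code):

```python
from typing import MutableSequence

def swap_bracket(bracket: MutableSequence[float], low: int, high: int) -> None:
    # MutableSequence declares __getitem__ and __setitem__, so both
    # the reads and the writes below are accepted by the checker.
    bracket[high], bracket[low] = bracket[low], bracket[high]
```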
ERROR `int` is not assignable to attribute `_numel_cache` with type `None` [bad-assignment]
--> torch/optim/lbfgs.py:255:33
|
255 | self._numel_cache = sum(
| _________________________________^
256 | | 2 * p.numel() if torch.is_complex(p) else p.numel()
257 | | for p in self._params
258 | | )
| |_____________^
|
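The `_numel_cache` error (and the `default_min_lr`/`min_lrs` ones below it) follow the same pattern: an attribute is first assigned `None` in `__init__` with no annotation, so its type is inferred as `None` and every later assignment is rejected. A minimal sketch of the fix, an explicit `Optional` annotation (illustrative class, not the torch one):

```python
from typing import Optional

class ParamHolder:
    def __init__(self) -> None:
        # Annotated as Optional[int], so the later int assignment is legal.
        self._numel_cache: Optional[int] = None

    def numel(self, sizes: list[int]) -> int:
        if self._numel_cache is None:
            self._numel_cache = sum(sizes)
        return self._numel_cache
```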
ERROR `float` is not assignable to attribute `default_min_lr` with type `None` [bad-assignment]
--> torch/optim/lr_scheduler.py:1668:35
|
1668 | self.default_min_lr = min_lr
| ^^^^^^
|
ERROR `list[Never]` is not assignable to attribute `min_lrs` with type `Never` [bad-assignment]
--> torch/optim/lr_scheduler.py:1727:32
|
1727 | self.min_lrs = [self.default_min_lr] * len(self.optimizer.param_groups)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR `float` is not assignable to variable `step_size_up` with type `int` [bad-assignment]
--> torch/optim/lr_scheduler.py:1906:24
|
1906 | step_size_up = float(step_size_up)
| ^^^^^^^^^^^^^^^^^^^
|
ERROR `float | int` is not assignable to variable `step_size_down` with type `int | None` [bad-assignment]
--> torch/optim/lr_scheduler.py:1908:13
|
1908 | float(step_size_down) if step_size_down is not None else step_size_up
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
ERROR `+` is not supported between `int` and `None` [unsupported-operation]
--> torch/optim/lr_scheduler.py:1910:27
|
1910 | self.total_size = step_size_up + step_size_down
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Argument `None` is not assignable to parameter `value` with type `int` in function `int.__add__`
ERROR `Args[_P]` is not subscriptable [unsupported-operation]
--> torch/optim/optimizer.py:65:32
|
65 | self = cast(Optimizer, args[0]) # assume first positional arg is `self`
| ^^^^^^^
|
ERROR `Args[_P]` is not subscriptable [unsupported-operation]
--> torch/optim/optimizer.py:138:29
|
138 | and (arg := args[state_steps_ind])
| ^^^^^^^^^^^^^^^^^^^^^
|
ERROR `Kwargs[_P]` is not subscriptable [unsupported-operation]
--> torch/optim/optimizer.py:143:35
|
143 | and (kwarg := kwargs["state_steps"])
| ^^^^^^^^^^^^^^^^^^^^^
|
ERROR Expected a type form, got instance of `Literal['Optimizer']` [not-a-type]
--> torch/optim/optimizer.py:362:66
|
362 | _optimizer_state_dict_pre_hooks: 'OrderedDict[int, Callable[["Optimizer"], None]]'
| ^^^^^^^^^^^
|
ERROR Expected a type form, got instance of `Literal['Optimizer']` [not-a-type]
--> torch/optim/optimizer.py:364:37
|
364 | 'OrderedDict[int, Callable[["Optimizer", StateDict], Optional[StateDict]]]'
| ^^^^^^^^^^^
|
ERROR Expected a type form, got instance of `Literal['Optimizer']` [not-a-type]
--> torch/optim/optimizer.py:367:37
|
367 | 'OrderedDict[int, Callable[["Optimizer", StateDict], Optional[StateDict]]]'
| ^^^^^^^^^^^
|
ERROR Expected a type form, got instance of `Literal['Optimizer']` [not-a-type]
--> torch/optim/optimizer.py:370:37
|
370 | 'OrderedDict[int, Callable[["Optimizer"], None]]'
| ^^^^^^^^^^^
|
ERROR No matching overload found for function `list.__init__` [no-matching-overload]
--> torch/optim/optimizer.py:394:28
|
394 | param_groups = list(params)
| ^^^^^^^^
|
Possible overloads:
() -> None [closest match]
(iterable: Iterable[_T], /) -> None
ERROR Expected *-unpacked _P.args and **-unpacked _P.kwargs [invalid-param-spec]
--> torch/optim/optimizer.py:517:27
|
517 | out = func(*args, **kwargs)
| ^^^^^^^^^^^^^^^^^
|
ERROR Argument `Unknown | None` is not assignable to parameter `param_id` with type `int` in function `Optimizer._process_value_according_to_param_policy` [bad-argument-type]
--> torch/optim/optimizer.py:952:35
|
952 | param, value, param_id, param_groups, key
| ^^^^^^^^
|
ERROR Argument `Unknown | None` is not assignable to parameter `param_groups` with type `list[dict[Any, Any]]` in function `Optimizer._process_value_according_to_param_policy` [bad-argument-type]
--> torch/optim/optimizer.py:952:45
|
952 | param, value, param_id, param_groups, key
| ^^^^^^^^^^^^
|
ERROR Expected 0 positional arguments, got 1 in function `object.__init__` [bad-argument-count]
--> torch/optim/optimizer.py:963:21
|
963 | / _cast(param, v, param_id=param_id, param_groups=param_groups)
964 | | for v in value
| |__________________________________^
|
ERROR `**` is not supported between `Tensor` and `float` [unsupported-operation]
--> torch/optim/radam.py:325:20
|
325 | return (
| ____________________^
326 | | (rho_t - 4)
327 | | * (rho_t - 2)
328 | | * rho_inf
329 | | / ((rho_inf - 4) * (rho_inf - 2) * rho_t)
330 | | ) ** 0.5
| |____________________^
|
Argument `float` is not assignable to parameter with type `TensorBase`
ERROR `**` is not supported between `Tensor` and `float` [unsupported-operation]
--> torch/optim/radam.py:325:20
|
325 | return (
| ____________________^
326 | | (rho_t - 4)
327 | | * (rho_t - 2)
328 | | * rho_inf
329 | | / ((rho_inf - 4) * (rho_inf - 2) * rho_t)
330 | | ) ** 0.5
| |____________________^
|
Expected 1 more positional argument
ERROR `**` is not supported between `Tensor` and `float` [unsupported-operation]
--> torch/optim/radam.py:339:21
|
339 | return (bias_correction2**0.5) / exp_avg_sq_sqrt
| ^^^^^^^^^^^^^^^^^^^^^
|
Argument `float` is not assignable to parameter with type `TensorBase`
ERROR `**` is not supported between `Tensor` and `float` [unsupported-operation]
--> torch/optim/radam.py:339:21
|
339 | return (bias_correction2**0.5) / exp_avg_sq_sqrt
| ^^^^^^^^^^^^^^^^^^^^^
|
Expected 1 more positional argument
ERROR `Literal[0] | Tensor | float` is not assignable to `float` (caused by inconsistent types when breaking cycles) [bad-assignment]
--> torch/optim/sgd.py:340:5
|
340 | / for i, param in enumerate(params):
341 | | grad = grads[i] if not maximize else -grads[i]
342 | |
343 | | if weight_decay != 0:
344 | | # Nested if is necessary to bypass jitscript rules
345 | | if isinstance(weight_decay, Tensor):
| |_________________________________________________^
|
ERROR Argument `Tensor` is not assignable to parameter `alpha` with type `bool | complex | float | int | None` in function `torch._C.TensorBase.add` [bad-argument-type]
--> torch/optim/sgd.py:350:50
|
350 | grad = grad.add(param, alpha=weight_decay)
| ^^^^^^^^^^^^
|
ERROR Argument `Tensor` is not assignable to parameter `alpha` with type `bool | complex | float | int | None` in function `torch._C.TensorBase.add_` [bad-argument-type]
--> torch/optim/sgd.py:373:40
|
373 | param.add_(grad, alpha=-lr)
| ^^^
|
ERROR Cannot index into `Sized` [index-error]
--> torch/optim/sgd.py:433:20
|
433 | if device_momentum_buffer_list[i] is None:
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Object of class `Sized` has no attribute `__getitem__`
ERROR Cannot index into `Sized` [index-error]
--> torch/optim/sgd.py:437:46
|
437 | bufs.append(cast(Tensor, device_momentum_buffer_list[i]))
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Object of class `Sized` has no attribute `__getitem__`
ERROR `Sized | list[Tensor | None]` is not assignable to `list[Tensor | None]` (caused by inconsistent types when breaking cycles) [bad-assignment]
--> torch/optim/sgd.py:444:17
|
444 | / for i in range(len(device_momentum_buffer_list)):
445 | | if device_momentum_buffer_list[i] is None:
446 | | buf = device_momentum_buffer_list[i] = momentum_buffer_list[
447 | | indices[i]
448 | | ] = device_grads[i].detach().clone()
449 | | else:
| |__________________________^
|
ERROR Cannot index into `Sized` [index-error]
--> torch/optim/sgd.py:445:24
|
445 | if device_momentum_buffer_list[i] is None:
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Object of class `Sized` has no attribute `__getitem__`
ERROR Cannot index into `Sized` [index-error]
--> torch/optim/sgd.py:450:44
|
450 | buf = cast(Tensor, device_momentum_buffer_list[i])
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Object of class `Sized` has no attribute `__getitem__`
ERROR Argument `Iterator[Tensor]` is not assignable to parameter `*iterables` with type `Iterable[Parameter]` in function `itertools.chain.__new__` [bad-argument-type]
--> torch/optim/swa_utils.py:252:55
|
252 | itertools.chain(self.module.parameters(), self.module.buffers())
| ^^^^^^^^^^^^^^^^^^^^^
|
ERROR Argument `Iterator[Tensor]` is not assignable to parameter `*iterables` with type `Iterable[Parameter]` in function `itertools.chain.__new__` [bad-argument-type]
--> torch/optim/swa_utils.py:257:49
|
257 | itertools.chain(model.parameters(), model.buffers())
| ^^^^^^^^^^^^^^^
|
ERROR Object of class `NoneType` has no attribute `device` [missing-attribute]
--> torch/optim/swa_utils.py:303:53
|
303 | n_averaged = self.n_averaged.to(p_averaged.device)
| ^^^^^^^^^^^^^^^^^
|
ERROR Object of class `NoneType` has no attribute `detach` [missing-attribute]
--> torch/optim/swa_utils.py:304:21
|
304 | p_averaged.detach().copy_(
| ^^^^^^^^^^^^^^^^^
|
ERROR Object of class `NoneType` has no attribute `detach` [missing-attribute]
--> torch/optim/swa_utils.py:305:37
|
305 | self.avg_fn(p_averaged.detach(), p_model, n_averaged)
| ^^^^^^^^^^^^^^^^^
|
ERROR Argument `Tensor | None` is not assignable to parameter with type `Tensor` [bad-argument-type]
--> torch/optim/swa_utils.py:305:58
|
305 | self.avg_fn(p_averaged.detach(), p_model, n_averaged)
| ^^^^^^^
|
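The `NoneType has no attribute` errors above are the flip side of the earlier `Optional`-attribute pattern: an attribute annotated (or inferred) as possibly `None` is dereferenced without a guard. A minimal sketch of the guard-then-narrow idiom (illustrative class, not the swa_utils code):

```python
from typing import Optional

class Averager:
    def __init__(self) -> None:
        self.n_averaged: Optional[int] = None

    def update(self, value: int) -> int:
        # The None check narrows Optional[int] to int, so the += and
        # the floor division below both type-check.
        if self.n_averaged is None:
            self.n_averaged = 0
        self.n_averaged += 1
        return value // self.n_averaged
```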
ERROR No matching overload found for function `min` [no-matching-overload]
--> torch/optim/swa_utils.py:492:28
|
492 | prev_t = max(0, min(1, (step - 1) / max(1, self.anneal_epochs)))
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Possible overloads:
(arg1: SupportsRichComparisonT, arg2: SupportsRichComparisonT, /, *_args: SupportsRichComparisonT, *, key: None = None) -> SupportsRichComparisonT [closest match]
(arg1: _T, arg2: _T, /, *_args: _T, *, key: (_T) -> SupportsDunderGT[Any] | SupportsDunderLT[Any]) -> _T
(iterable: Iterable[SupportsRichComparisonT], /, *, key: None = None) -> SupportsRichComparisonT
(iterable: Iterable[_T], /, *, key: (_T) -> SupportsDunderGT[Any] | SupportsDunderLT[Any]) -> _T
(iterable: Iterable[SupportsRichComparisonT], /, *, key: None = None, default: _T) -> SupportsRichComparisonT | _T
(iterable: Iterable[_T1], /, *, key: (_T1) -> SupportsDunderGT[Any] | SupportsDunderLT[Any], default: _T2) -> _T1 | _T2
ERROR No matching overload found for function `min` [no-matching-overload]
--> torch/optim/swa_utils.py:498:23
|
498 | t = max(0, min(1, step / max(1, self.anneal_epochs)))
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
Possible overloads:
(arg1: SupportsRichComparisonT, arg2: SupportsRichComparisonT, /, *_args: SupportsRichComparisonT, *, key: None = None) -> SupportsRichComparisonT [closest match]
(arg1: _T, arg2: _T, /, *_args: _T, *, key: (_T) -> SupportsDunderGT[Any] | SupportsDunderLT[Any]) -> _T
(iterable: Iterable[SupportsRichComparisonT], /, *, key: None = None) -> SupportsRichComparisonT
(iterable: Iterable[_T], /, *, key: (_T) -> SupportsDunderGT[Any] | SupportsDunderLT[Any]) -> _T
(iterable: Iterable[SupportsRichComparisonT], /, *, key: None = None, default: _T) -> SupportsRichComparisonT | _T
(iterable: Iterable[_T1], /, *, key: (_T1) -> SupportsDunderGT[Any] | SupportsDunderLT[Any], default: _T2) -> _T1 | _T2
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment