Updated ONNX Parser #1

Open
samysweb wants to merge 6 commits into dev from feat/nn-loader

Conversation

@samysweb (Owner)

This pull request updates the ONNX parser to support the following layer types:
  • Dense
  • ReLU
  • AddConst

@samysweb changed the title from "[DRAFT] Updated ONNX Parser" to "Updated ONNX Parser" on Feb 23, 2026
@samysweb requested review from Copilot and phK3 on February 23, 2026 at 13:34

Copilot AI left a comment

Pull request overview

This pull request updates the ONNX parser to support Dense, ReLU, and AddConst layer types, representing a significant architectural refactoring of the network representation system. The changes migrate from a simple Layer-based structure to a more sophisticated ONNX-based node representation with proper topological sorting and I/O mapping.

Changes:

  • Migrated from Network/Layer types to OnnxNet/Node types with support for ONNXLinear, ONNXRelu, and ONNXAddConst layers
  • Added support for AddConst layers throughout the propagation and initialization pipeline
  • Refactored network construction to use topological sorting and proper I/O mapping
  • Updated test infrastructure to support the new layer types with comprehensive test coverage
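The topological sorting that the refactored network construction relies on can be illustrated with a generic Kahn's-algorithm sketch. This is not the actual OnnxNet construction code; the function name `toposort` and the integer-node representation are stand-ins for illustration only.

```julia
# Hedged sketch: Kahn's algorithm for ordering computation-graph nodes so
# that every node appears after all of its inputs. The real code in
# src/Definitions/Network.jl operates on ONNX nodes, not plain integers.
function toposort(n::Int, edges::Vector{Tuple{Int,Int}})
    indeg = zeros(Int, n)
    adj = [Int[] for _ in 1:n]
    for (u, v) in edges
        push!(adj[u], v)
        indeg[v] += 1
    end
    queue = [v for v in 1:n if indeg[v] == 0]  # nodes with no pending inputs
    order = Int[]
    while !isempty(queue)
        u = popfirst!(queue)
        push!(order, u)
        for v in adj[u]
            indeg[v] -= 1
            indeg[v] == 0 && push!(queue, v)
        end
    end
    length(order) == n || error("graph has a cycle")
    return order
end

# Diamond graph: 1 feeds 2 and 3, which both feed 4.
order = toposort(4, [(1, 2), (1, 3), (2, 4), (3, 4)])  # [1, 2, 3, 4]
```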

Reviewed changes

Copilot reviewed 25 out of 27 changed files in this pull request and generated 4 comments.

Summary per file:

  • test/unit/propagation/utils.jl: Refactored utility functions to create OnnxNet networks with support for ReLU and AddConst layers; added mutation synchronization
  • test/unit/propagation/relu_addConst.jl: New comprehensive test suite for ReLU + AddConst + Dense layer propagation
  • test/unit/propagation/relu.jl: Updated to use the new executable_network pattern and removed partial network testing
  • test/unit/propagation/main.jl: Re-enabled previously commented-out test includes
  • test/unit/propagation/dense_zero_diff.jl: Updated to use a depth-based loop structure and the executable_network pattern
  • test/unit/propagation/dense.jl: Updated to use a depth-based loop structure and the executable_network pattern
  • test/runtests.jl: Added imports for the new ONNX types and the executable_network function
  • test/compare_fuzz.jl: New fuzzing framework for comparing the Julia and PyTorch implementations
  • src/VeryDiff.jl: Removed outdated Util.jl include
  • src/Verifier.jl: Updated to use OnnxNet types and added Softmax removal logic
  • src/Util.jl: Deleted legacy utility file
  • src/Transformers/Util.jl: Updated initialization functions to use output dimensions instead of layer objects
  • src/Transformers/Transformers.jl: Added VNNLib imports for ONNX types
  • src/Transformers/Single_Transformers.jl: Implemented propagation for ONNXLinear, ONNXAddConst, and ONNXRelu
  • src/Transformers/Network.jl: Refactored propagation to support sparse storage and multiple outputs
  • src/Transformers/Init.jl: Added initialization for ONNXAddConst and updated for the new ONNX types
  • src/Transformers/Diff_Transformers.jl: Implemented differential propagation for ONNXLinear and ONNXAddConst
  • src/Properties/Top1.jl: Updated to assume Softmax is pre-applied in network outputs
  • src/Definitions/PropState.jl: Renamed get_layer_inputs to get_zonos_at_pos and updated storage logic
  • src/Definitions/Network.jl: Major refactoring: added topological sorting, I/O mapping, and OnnxLayer/DiffLayer structures
  • src/Definitions/Definitions.jl: Updated exports for new types and functions
  • src/Definitions/AbstractDomains.jl: Changed ZonotopeStorage to support Union{Nothing,CachedZonotope}
  • src/Cli.jl: Updated to use load_onnx_model instead of load_network
  • deps/sysimage/trace_run.jl: Added an additional test case for precompilation
  • Project.toml: Added VNNLib version constraint (0.3.0)
  • Manifest.toml: Updated dependencies to Julia 1.11.8 and various package updates
  • .github/workflows/tests.yml: Updated VNNLib revision in CI workflow
Comments suppressed due to low confidence (8)

src/Transformers/Init.jl:127

  • The assertion message on line 127 incorrectly states "Dense DiffLayer" when this function is for ONNXAddConst layers. The message should be updated to accurately reflect the layer type being validated (e.g., "AddConst DiffLayer").
    @assert length(inputs) == 1 "Dense DiffLayer should have exactly one input"

test/unit/propagation/utils.jl:13

  • The variable layer_metadata is initialized on line 13 but never actually used. It appears to be leftover from development and should be removed to avoid confusion.
    layer_metadata = Vector{Tuple{String, Int}}()  # Track (layer_name, mutation_type) for synchronization

src/Transformers/Network.jl:20

  • The output_positions variable is assigned twice (lines 9 and 20), with the second assignment being redundant since get_outputs(diff_layer) returns the same value. The line 20 assignment should be removed to avoid confusion and unnecessary computation.
        output_positions = get_outputs(diff_layer)

src/Transformers/Diff_Transformers.jl:63

  • The assertion message on line 63 incorrectly states "Dense layer" when this function is for ONNXAddConst layers. The message should be updated to accurately reflect the layer type being validated.
    @assert length(inputs) == 1 "Dense layer should have exactly one input zonotope"

src/Verifier.jl:51

  • The check for ONNXSoftmax at line 51 uses a complex type pattern that may not work correctly due to the parameterization. The where S clause makes this check less reliable. Consider checking the type more explicitly or using a helper function to determine if the last layer is a Softmax layer.
    if N.diff_layers[end] isa VeryDiff.Definitions.DiffLayer{VNNLib.OnnxParser.ONNXSoftmax{S},VNNLib.OnnxParser.ONNXSoftmax{S},VNNLib.OnnxParser.ONNXSoftmax{S}} where S
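One way to implement the helper this comment suggests is to dispatch on the wrapped layer type rather than spelling out the full parameterization inline. The sketch below uses minimal stand-in definitions of `DiffLayer` and `ONNXSoftmax` (the real ones live in VeryDiff.Definitions and VNNLib.OnnxParser); `is_softmax` and `DenseStub` are hypothetical names.

```julia
# Stand-in types; the real definitions come from VeryDiff / VNNLib.OnnxParser.
struct ONNXSoftmax{S}
    shape::S
end
struct DenseStub end

struct DiffLayer{A,B,C}
    layer::A
end

# Multiple dispatch absorbs the `where S` parameterization: any DiffLayer
# whose three parameters are ONNXSoftmax{...} matches the first method.
is_softmax(::DiffLayer{<:ONNXSoftmax,<:ONNXSoftmax,<:ONNXSoftmax}) = true
is_softmax(::DiffLayer) = false

sm = DiffLayer{ONNXSoftmax{Int},ONNXSoftmax{Int},ONNXSoftmax{Int}}(ONNXSoftmax(1))
other = DiffLayer{DenseStub,DenseStub,DenseStub}(DenseStub())

is_softmax(sm)     # true
is_softmax(other)  # false
```

The call site then reduces to `if is_softmax(N.diff_layers[end]) ...`, with no explicit `where` clause needed.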

test/unit/propagation/relu_addConst.jl:48

  • Accessing prop_state.zono_storage.zonotopes[end] may fail if the storage is sparse (contains nothing values). The code should verify that the last element is not nothing before accessing its .zonotope field, or should find the last non-nothing element.
        Zout = prop_state.zono_storage.zonotopes[end].zonotope
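A defensive version of this access could use `findlast` to locate the last populated slot instead of indexing `[end]` directly. The types below are simplified stand-ins for `CachedZonotope` and the storage vector; only the access pattern is the point.

```julia
# Stand-in for CachedZonotope; the real type lives in VeryDiff.
struct CachedZonotopeStub
    zonotope::Vector{Float64}
end

# Sparse storage: some slots were never filled during propagation.
zonotopes = Union{Nothing,CachedZonotopeStub}[
    CachedZonotopeStub([1.0]),
    nothing,
    CachedZonotopeStub([2.0, 3.0]),
    nothing,
]

# findlast skips trailing `nothing` slots instead of erroring on them.
idx = findlast(!isnothing, zonotopes)
Zout = zonotopes[idx].zonotope  # [2.0, 3.0]
```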

src/Transformers/Diff_Transformers.jl:67

  • The variable i_out is initialized to 1 on line 67 but is immediately overwritten in the for loop on line 70. This initialization is redundant and should be removed.
    i_out = 1

src/Definitions/Network.jl:42

  • The ZeroDense struct is parameterized with LayerIdT but doesn't have any fields. This creates an ambiguity: the struct can't store or track any layer-specific information. Consider either removing the type parameter if it's not needed, or adding fields to store necessary metadata about the layer.
struct ZeroDense{LayerIdT} <: Node{LayerIdT}
end
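The second resolution suggested above (adding a field) can be sketched as follows, with a stand-in `Node` abstract type; the field name `id` is hypothetical. Storing a value of type `LayerIdT` ties the type parameter to actual data and lets Julia infer the parameter at construction time.

```julia
# Stand-in for the abstract node type in src/Definitions/Network.jl.
abstract type Node{LayerIdT} end

struct ZeroDense{LayerIdT} <: Node{LayerIdT}
    id::LayerIdT  # the layer identifier the type parameter now actually tracks
end

z = ZeroDense("layer_3")  # LayerIdT inferred as String
z isa Node{String}        # true
```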


