
Conversation

@tz545 commented Jun 11, 2025

In the HyperGAT paper, the second attention step should compute edge-level attention. However, the attention function in hypergat_layer.py computes node-level attention by default, so the "edge_level" mechanism should be specified explicitly on line 203.
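
A minimal sketch of why the default matters (the toy function, weight names, and shapes below are all assumptions for illustration; only the fact that the node-level mechanism is the default comes from hypergat_layer.py):

```python
import torch

def attention(x, weight_node, weight_edge, mechanism="node_level"):
    """Toy stand-in for the layer's attention function (names and shapes are assumptions)."""
    # The mechanism selects which attention vector is applied; a caller that
    # omits the keyword silently falls back to node-level attention.
    weight = weight_node if mechanism == "node_level" else weight_edge
    return torch.softmax(x @ weight, dim=0)

x_message = torch.randn(4, 8)
w_node, w_edge = torch.randn(8), torch.randn(8)

a_default = attention(x_message, w_node, w_edge)                        # node-level by default
a_edge = attention(x_message, w_node, w_edge, mechanism="edge_level")   # explicit, as suggested for line 203
```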

Additionally, on line 116 of the attention function, I believe self.target_index_i $\in [0, N]$ should be changed to self.source_index_j $\in [0, E]$ for node-level attention (where $N$ is the number of nodes and $E$ is the number of hyperedges), since the input to node-level attention has shape (E, h_dim).

Similarly, line 125 should be changed to index a node, and line 129 to index a hyperedge.
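
To make the indexing point concrete, a purely illustrative example (the tensor names and toy sizes are assumptions; only self.source_index_j, self.target_index_i, and the (E, h_dim) shape come from the code): an index used to gather rows must range over the first dimension of the tensor it indexes.

```python
import torch

# Toy sizes; N nodes and E hyperedges are assumptions for illustration.
N, E, h_dim = 5, 3, 8
x_edges = torch.randn(E, h_dim)  # input to node-level attention, shape (E, h_dim)
x_nodes = torch.randn(N, h_dim)  # input to edge-level attention, shape (N, h_dim)

source_index_j = torch.tensor([0, 2, 1, 2])  # hyperedge indices, values in [0, E)
target_index_i = torch.tensor([4, 0, 3, 1])  # node indices, values in [0, N)

# The gather index must match the first dimension of the tensor it indexes:
x_edges[source_index_j]   # OK: shape (4, h_dim)
x_nodes[target_index_i]   # OK: shape (4, h_dim)
# x_edges[target_index_i] # IndexError whenever an index value >= E, hence the suggested swap
```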

codecov bot commented Jun 11, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 97.37%. Comparing base (fd6ad4f) to head (aa9ecd2).

Additional details and impacted files
@@           Coverage Diff           @@
##             main     #337   +/-   ##
=======================================
  Coverage   97.37%   97.37%           
=======================================
  Files          58       58           
  Lines        2060     2060           
=======================================
  Hits         2006     2006           
  Misses         54       54           

☔ View full report in Codecov by Sentry.
