autodiffcomposition.py
  - infer_backpropagation_learning_pathways(): add NodeRole.BIAS to pathways constructed for learning
jdcpni committed Nov 21, 2024
1 parent 9a92fdc commit 9c4a84b
Showing 4 changed files with 20 additions and 10 deletions.
3 changes: 2 additions & 1 deletion psyneulink/core/components/ports/inputport.py
@@ -713,7 +713,8 @@ class InputPort(Port_Base):
 is executed and its variable is assigned None. If *default_input* is assigned *DEFAULT_VARIABLE*, then the
 `default value <Parameter_Defaults>` for the InputPort's `variable <InputPort.variable>` is used as its value.
 This is useful for assignment to a Mechanism that needs a constant (i.e., fixed value) as the input to its
-`function <Mechanism_Base.function>`.
+`function <Mechanism_Base.function>` (such as a `bias unit <AutodiffComposition_Bias_Parameters>` in an
+`AutodiffComposition`).
 .. note::
 If `default_input <InputPort.default_input>` is assigned *DEFAULT_VARIABLE*, then its `internal_only
2 changes: 1 addition & 1 deletion psyneulink/core/compositions/composition.py
@@ -3369,7 +3369,7 @@ class NodeRole(enum.Enum):
 BIAS
 A `Node <Composition_Nodes>` for which one or more of its `InputPorts <InputPort>` is assigned
 *DEFAULT_VARIABLE* as its `default_input <InputPort.default_input>` (which provides it a prespecified
-input that is constant across executions). Such a node can also be assigned as an `INPUT` and/or `ORIGIN`,
+input that is constant across executions). Such a node can also be assigned as an `INPUT` and/or `ORIGIN`,
 if it receives input from outside the Composition and/or does not receive any `Projections <Projection>` from
 other Nodes within the Composition, respectively. This role cannot be modified programmatically.

20 changes: 14 additions & 6 deletions psyneulink/library/compositions/autodiffcomposition.py
@@ -110,10 +110,17 @@
 AutodiffComposition does not (currently) support the *automatic* construction of separate bias parameters.
 Thus, when constructing a model using an AutodiffComposition that corresponds to one in PyTorch, the `bias
 <https://www.pytorch.org/docs/stable/nn.html#torch.nn.Module>` parameter of PyTorch modules should be set
-to `False`. Trainable biases *can* be specified explicitly in an AutodiffComposition by including a
-TransferMechanism that projects to the relevant Mechanism (i.e., implementing that layer of the network to
-receive the biases) using a `MappingProjection` with a `matrix <MappingProjection.matrix>` parameter that
-implements a diagnoal matrix with values corresponding to the initial value of the biases.
+to `False`.
+
+.. hint::
+   Trainable biases *can* be specified explicitly in an AutodiffComposition by including a `ProcessingMechanism`
+   that projects to the relevant Mechanism (i.e., implementing that layer of the network to receive the biases)
+   using a `MappingProjection` with a `matrix <MappingProjection.matrix>` parameter that implements a diagonal
+   matrix with values corresponding to the initial value of the biases, and setting the `default_input
+   <InputPort.default_input>` Parameter of one of the ProcessingMechanism's `input_ports
+   <Mechanism_Base.input_ports>` to *DEFAULT_VARIABLE*, and its `default_variable <Component.default_variable>`
+   equal to 1. ProcessingMechanisms configured in this way are assigned the `NodeRole` `BIAS`, and the
+   MappingProjection is subject to learning.
+
 .. _AutodiffComposition_Nesting:
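The hint in the new docstring text can be checked numerically: a node that emits a constant vector of ones, projected through a diagonal matrix initialized to the bias values, contributes exactly the same additive term as a conventional bias parameter. The following is a minimal NumPy sketch of that equivalence, independent of the PsyNeuLink API; all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)                # input to a layer
W = rng.normal(size=(3, 4))           # input -> layer weight matrix
b = np.array([0.1, -0.2, 0.3, 0.0])   # desired initial bias values

# Conventional formulation: y = x @ W + b
y_direct = x @ W + b

# Bias-node formulation: a node whose output is a constant vector of ones,
# connected through a diagonal "projection" matrix whose entries are the
# biases; that matrix is then an ordinary learnable weight matrix.
bias_node_output = np.ones(4)
bias_projection = np.diag(b)
y_bias_node = x @ W + bias_node_output @ bias_projection

assert np.allclose(y_direct, y_bias_node)
```

Because the bias enters as a weight matrix on a constant-valued node, gradient-based learning of that matrix is what makes the bias trainable.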
@@ -951,8 +958,9 @@ def create_pathway(node)->list:
 
         return pathways
 
-        # Construct a pathway for each INPUT Node (except the TARGET Node)
-        pathways = [pathway for node in self.get_nodes_by_role(NodeRole.INPUT)
+        # Construct a pathway for each INPUT Node (including BIAS Nodes, except the TARGET Node)
+        pathways = [pathway
+                    for node in (self.get_nodes_by_role(NodeRole.INPUT) + self.get_nodes_by_role(NodeRole.BIAS))
                     if node not in self.get_nodes_by_role(NodeRole.TARGET)
                     for pathway in _get_pytorch_backprop_pathway(node)]
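The change to the comprehension above amounts to: seed a backpropagation pathway from every INPUT node *and* every BIAS node, skipping TARGET nodes. A stand-alone sketch of that selection logic follows; the node names and plain lists are made up for illustration (the real code queries `get_nodes_by_role` on the composition).

```python
# Hypothetical stand-ins for self.get_nodes_by_role(...) results
input_nodes = ["stimulus", "context", "target"]   # NodeRole.INPUT
bias_nodes = ["bias_unit"]                        # NodeRole.BIAS
target_nodes = {"target"}                         # NodeRole.TARGET

# Mirror of the updated comprehension: INPUT + BIAS nodes, minus TARGETs
start_nodes = [node for node in input_nodes + bias_nodes
               if node not in target_nodes]

print(start_nodes)  # -> ['stimulus', 'context', 'bias_unit']
```

Before this commit, `bias_nodes` would not have been included, so BIAS nodes had no learning pathway constructed for them.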
5 changes: 3 additions & 2 deletions tests/composition/test_emcomposition.py
@@ -854,7 +854,8 @@ def test_backpropagation_of_error_in_learning(self):
 function=pnl.Tanh,
 integrator_mode=True,
 integration_rate=.69)
-em = EMComposition(memory_template=[[0] * 11, [0] * 11, [0] * 11], # context
+em = EMComposition(name='EM',
+                   memory_template=[[0] * 11, [0] * 11, [0] * 11], # context
 memory_fill=(0,.0001),
 memory_capacity=50,
 memory_decay_rate=0,
@@ -1046,7 +1047,7 @@ def test_backpropagation_of_error_in_learning(self):
 # [0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
 # [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0],
 # [0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0]]
-
+#
 # fig, axes = plt.subplots(3, 1, figsize=(5, 12))
 # axes[0].imshow(EGO.projections[7].parameters.matrix.get(EGO.name), interpolation=None)
 # axes[1].plot((1 - np.abs(EGO.results[1:50,2]-TARGETS[:49])).sum(-1))
