Path: sparky!uunet!gatech!purdue!mentor.cc.purdue.edu!noose.ecn.purdue.edu!lips.ecn.purdue.edu!kavuri
From: kavuri@lips.ecn.purdue.edu (Surya N Kavuri )
Newsgroups: comp.ai.neural-nets
Subject: Re: Network Inversion
Message-ID: <1992Aug23.181041.22790@noose.ecn.purdue.edu>
Date: 23 Aug 92 18:10:41 GMT
References: <BtCqHo.AtH.1@cs.cmu.edu> <arms.714522787@spedden>
Sender: news@noose.ecn.purdue.edu (USENET news)
Organization: Purdue University Engineering Computer Network
Lines: 52

In article <arms.714522787@spedden>, arms@cs.UAlberta.CA (Bill Armstrong) writes:
> tjochem+@CS.CMU.EDU (Todd Jochem) writes:
>
> >I'm looking for references to network inversion papers. The basic idea
> >I'm interested in is presenting the network's output with a signal and
> >by back-propagating this signal though the layer(s), recreating an input
> >which could have created the applied output signal.
>
> This is intractable for a general multi-layer perceptron.
>
> Proof: We have to show it for the special case of ALNs, which are
> trees of nodes realizing AND, OR, LEFT and RIGHT functions, with
> leaves connected to input bits and complements. The same input bit or
> complement may be sent to many leaves. It will be enough to show it
> for a two-layer tree, with ORs feeding into an AND. Back-propagating
> a 1 signal is just CNF-satisfiability, and is NP-complete.
>
> Comment: the difficulty arises because back-propagating signals often
> converge at the same node, either at the inputs (ALNs) or in hidden
> layers (MLPs), and they are likely to have contradictory values when
> they do.
>
> --
> ***************************************************
> Prof. William W. Armstrong, Computing Science Dept.

Maybe there are easier ways of explaining this :-)

Inverting an MLP may not return the exact input that was used in
training, because the function the network computes need not be
invertible: many different inputs can produce the same output.
You can use gradient descent to find any one of them (as sketched
below), or genetic-style multiple-point search to find several.
In general, the input you recover may not reproduce the output you
started from (you may be stuck in a local minimum, or the output may
not be reachable by your network at all), so there is no guarantee
of finding an inversion.
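
A rough sketch of the gradient-descent route (the tiny 2-4-1 network,
its weights and the target value below are made up for illustration;
in practice you would plug in your trained weights): hold the weights
fixed, treat the input as the free variable, and descend the squared
output error.  As noted above, the search may stall in a local minimum
or chase an output the network cannot produce.

    import numpy as np

    rng = np.random.default_rng(0)
    # Stand-in for a trained 2-4-1 MLP: fixed (here random) weights.
    W1, b1 = rng.normal(size=(4, 2)), rng.normal(size=4)
    W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    y_target = np.array([0.8])    # output signal to "invert"
    x = rng.normal(size=2)        # starting guess for the input

    for step in range(2000):
        a1 = W1 @ x + b1                      # forward pass
        h = np.tanh(a1)
        y = sigmoid(W2 @ h + b2)
        d_a2 = (y - y_target) * y * (1 - y)   # back-propagate the error ...
        d_a1 = (W2.T @ d_a2) * (1 - h**2)
        d_x = W1.T @ d_a1                     # ... all the way to the input
        x -= 0.5 * d_x            # gradient step on the INPUT, not the weights

    # x is now one input (of possibly many) whose output is near y_target
    print(x, sigmoid(W2 @ np.tanh(W1 @ x + b1) + b2))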

You can see this another way:
since each node maps many inputs to a single value, you may be
able to find many different inputs that give you the same node value.
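
A one-node illustration (numbers made up): a sigmoid unit with weights
(1, 1) only sees the sum of its inputs, so any two inputs with the same
sum are indistinguishable at its output, and recovering "the" input from
that node value is ill-posed.

    import numpy as np
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    w = np.array([1.0, 1.0])
    print(sig(w @ np.array([0.2, 0.8])))   # 0.7310...
    print(sig(w @ np.array([0.5, 0.5])))   # same value; both inputs sum to 1.0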
Surya Kavuri