

G+_Bill Mattheis

[Image: fhe-diagram.png, the commutative square discussed below]

A while ago there was an episode on fully homomorphic encryption, and here is another look at this interesting problem. I highly recommend checking out John's other posts if you are a math buff.

 

Originally shared by John Baez

 

Could category theory help save privacy?

 

Jeremy Kun has a great blog article about encryption and privacy:

 

http://jeremykun.com/2013/06/10/why-theoretical-computer-scientists-arent-worried-about-privacy/

 

He quotes Obama as saying:

 

You can’t have 100% security and 100% privacy, and also zero inconvenience.

 

But he argues against this, pointing out:

 

There are two facts which are well known in theoretical computer science that the general public is not aware of. The first is about the privacy of databases:

 

There is a way to mine information from a database without the ability to inspect individual entries in the database.
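To make this first fact a bit more concrete, here is a minimal Python sketch of the Laplace mechanism, the standard trick behind it. The dataset, the predicate, and the epsilon value are made up for illustration; a real system would also need a careful privacy-budget analysis.

import random

def laplace_noise(scale):
    # The difference of two independent exponentials is Laplace-distributed.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(records, predicate, epsilon):
    # A counting query has sensitivity 1: adding or removing one person changes
    # the true answer by at most 1, so Laplace noise of scale 1/epsilon suffices.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical toy database: how many ages are over 40?
ages = [23, 35, 41, 52, 29, 61, 47, 38]
print(private_count(ages, lambda a: a > 40, epsilon=0.5))

The analyst sees a noisy count that is still useful in aggregate, while the noise hides whether any single individual is in the database at all.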

 

This is known as differential privacy. The second is no less magical:

 

There are secure encryption schemes which allow one to run programs on encrypted data and produce encrypted results, without the ability to decrypt the data.

 

This is known as fully homomorphic encryption.

 

These ideas are exciting for lots of practical reasons.   The second one means that big companies and governments can do useful things with encrypted data you send them, and then send back the results for you to see, encoded in a way only you can decode... without them being able to know what your data was! 

 

But as a mathematician, I really love how this is an example of category theory in action! 

 

The funny diagram below expresses the key idea.  We start with a message m and encrypt it, getting the encrypted message enc(m).   This process is drawn as the vertical arrow at left, labelled enc.  Arrows are processes: things we can do.

 

Alternatively, we could run some program f on our message and get some output f(m) - that's what the horizontal arrow at bottom means.    Then we could encrypt that and get the encrypted output enc(f(m)).

 

But it would be really cool if anyone in the world could produce this encrypted output starting from the encrypted message, without knowing how to decode that encrypted message.

 

And they can. There is a program, called eval in this picture, that depends on the program f but not on your secret key; it takes enc(m) as input and produces the encrypted output enc(f(m)).

 

So: going up the left-hand arrow and then across the top of the rectangle is the same as going across the bottom and then going up the right-hand arrow.  This is a very standard idea in category theory. 
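Since the original picture may not survive reposting, here is the square the text describes, sketched in LaTeX with the tikz-cd package; the labels match the ones used above.

\[
\begin{tikzcd}
\mathrm{enc}(m) \arrow[r, "\mathrm{eval}"] & \mathrm{enc}(f(m)) \\
m \arrow[u, "\mathrm{enc}"] \arrow[r, "f"'] & f(m) \arrow[u, "\mathrm{enc}"']
\end{tikzcd}
\]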

 

In short: eval(enc(m)) = enc(f(m)).
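As a toy illustration of that equation, textbook RSA happens to be homomorphic for multiplication, which is enough to watch the square commute. This is only a sketch: the parameters are the classic tiny textbook ones and are completely insecure, and real fully homomorphic schemes (such as Gentry's) handle arbitrary programs, not just multiplication.

# Textbook RSA with tiny example parameters (insecure; for illustration only).
p, q = 61, 53
n = p * q                             # public modulus, 3233
e = 17                                # public exponent
d = pow(e, -1, (p - 1) * (q - 1))     # private exponent (needs Python 3.8+)

def enc(m):
    return pow(m, e, n)               # encrypt with the public key

def dec(c):
    return pow(c, d, n)               # decrypt with the private key

def f(m):
    return (m * m) % n                # the program we want to run: square the message

def eval_f(c):
    return (c * c) % n                # the same program run on a ciphertext, public key only

m = 42
assert eval_f(enc(m)) == enc(f(m))    # going around the square both ways agrees
assert dec(eval_f(enc(m))) == f(m)    # and the encrypted result decrypts to f(m)
print(dec(eval_f(enc(m))))            # 1764, which is 42 * 42

One caveat: real encryption schemes are randomized, so in general the two sides of the equation decrypt to the same plaintext rather than matching byte for byte; textbook RSA is deterministic, which is why the equality above is exact.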

 

But the cool part is that anyone in the world can have access to the program eval but still not be able to decode your encrypted message.  So, people can do useful things with your data without ever knowing your data.  Privacy is preserved.

 

Thanks to Alexander Kruel for pointing this out.
