Complexity theory – CTCs and information time travel — what non-trivial insights do they lead to?


Context:

In quantum complexity theory and quantum information, several papers study the implications of closed timelike curves (CTCs). In 2008, Aaronson and Watrous published their legendary paper on this topic, which shows that certain forms of time travel make classical and quantum computing equivalent: both models, given access to CTCs, have exactly the power of PSPACE, so quantum computers provide no computational advantage if they can send information to the past through closed timelike curves. In the paper’s abstract they specifically mention:

While closed timelike curves (CTCs) are not known to exist, studying their consequences has led to nontrivial insights in general relativity, quantum information, and other areas.

Now, this sounds pretty interesting, although I haven’t really read any of the papers related to CTCs.

Questions:

  1. In brief, what really are closed timelike curves?
  2. Would it be possible to give a general idea about why this “time travel” of information can make quantum computing equivalent to classical computing, without diving too much into the details?†
  3. What other non-trivial insights has the study of CTCs provided us in the context of quantum information?

I do intend to work through some of the papers on CTCs myself and perhaps also write an answer to this question at some point. But the aim of this thread is to gather some ideas about the “big picture” regarding why quantum information theorists and complexity theorists are interested in CTCs.


†: The abstract says that “given any quantum circuit (not necessarily unitary), a fixed-point of the circuit can be (implicitly) computed in polynomial space”. I’m not sure how this fixed-point computation relates to sending information via CTCs, or why it implies that quantum computing with CTCs is no more efficient than classical computing.
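For what it’s worth, here is a minimal sketch (my own, not from the paper) of the fixed-point idea in Deutsch’s CTC model, which the Aaronson–Watrous paper builds on. Deutsch’s causal-consistency condition says the state ρ that a circuit places on a CTC must satisfy N(ρ) = ρ, where N is the quantum channel the circuit induces on the CTC qubits. Finding such a ρ is an eigenvector problem on the channel’s transfer matrix; for n qubits that matrix is 4^n-dimensional, which is presumably why the paper only claims the fixed point is computable in polynomial space, and why CTC-assisted computation (quantum or classical) lands at PSPACE rather than anything larger. The helper names below are mine; the example channel is single-qubit amplitude damping, whose unique fixed point is |0⟩⟨0|:

```python
import numpy as np

# Illustrative only (my own sketch, not code from the Aaronson–Watrous paper):
# in Deutsch's CTC model, the state rho on the closed timelike curve must be
# a fixed point of the channel N that the circuit applies to it: N(rho) = rho.

def transfer_matrix(kraus_ops):
    """4x4 matrix S of a 1-qubit channel with vec(N(rho)) = S @ vec(rho),
    using row-major vectorization to match numpy's reshape."""
    return sum(np.kron(K, K.conj()) for K in kraus_ops)

def deutsch_fixed_point(kraus_ops):
    """Density matrix rho with N(rho) = rho: the eigenvector of S at eigenvalue 1."""
    S = transfer_matrix(kraus_ops)
    vals, vecs = np.linalg.eig(S)
    v = vecs[:, np.argmin(np.abs(vals - 1.0))]  # every channel has eigenvalue 1
    rho = v.reshape(2, 2)
    rho = (rho + rho.conj().T) / 2              # project back to Hermitian
    return rho / np.trace(rho)                  # normalize to trace 1

# Example: amplitude damping with decay probability g. Its unique fixed point
# is |0><0|, so a qubit sent around this CTC must "already" be in |0>.
g = 0.5
kraus = [np.array([[1, 0], [0, np.sqrt(1 - g)]]),
         np.array([[0, np.sqrt(g)], [0, 0]])]
print(np.round(deutsch_fixed_point(kraus), 3))  # ~ [[1. 0.] [0. 0.]]
```

As I understand it, the classical simulation in the paper amounts to doing this kind of linear algebra on an exponentially large matrix using only polynomial space, while the reverse direction shows a CTC lets you force Nature to hand you such a fixed point, which is enough to solve PSPACE problems.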

