Spatial Quantum Search
Searching on a Lattice
So if we’re going to do spatial search, we need a model of the spatial properties of the database. The more obvious physical implementations of a quantum database are discretized in such a way that they are (to first order) analogous to a square lattice (actually any graph will do, but square lattices seem most intuitive). Each vertex on the graph can store one piece of data (I’m intentionally leaving ‘piece’ undefined), and information can only flow directly from one vertex to another if they are neighbors. Note that by “square lattice”, I don’t necessarily mean a two-dimensional, four-sided polygon whose edges all have equal length. I mean instead that the vertices of the lattice are connected to one another as they are on linear graph paper; the vertices are arranged and connected in squares (or the generalization of such connectivity in other dimensions).
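To make the lattice model concrete, here’s a minimal sketch (the function name and the 4×4 example are just illustrative): each site of a 2D square lattice holds one piece of data, and the only sites it can exchange information with directly are its lattice neighbors.

```python
# Sketch of the lattice model: each vertex of a 2D square lattice stores one
# piece of data, and information can only move between neighboring vertices.

def lattice_neighbors(x, y, side):
    """Return the sites directly reachable from (x, y) on a side x side lattice."""
    candidates = [(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]
    return [(i, j) for i, j in candidates if 0 <= i < side and 0 <= j < side]

# A corner site has 2 neighbors; an interior site has 4.
print(lattice_neighbors(0, 0, 4))       # [(1, 0), (0, 1)]
print(len(lattice_neighbors(2, 2, 4)))  # 4
```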
So, why is spatial search more interesting? Because information can only move at (at most) the speed of light. If two places are separated in space, it takes time for information to go from one place to the other. So, if you have a database with spatially separated storage sites, it could take more time to traverse the database looking for the piece of data you want than if the database were not spatially separated.
Lower Bounds
If you consider a one-dimensional square lattice with N sites (a line of length N), the amount of time to go from one end of the line to the other will scale like N; spatial search on a line takes Ω(N) operations (the number of operations is lower bounded by something that scales like N). This sort of lower bound is called the ‘diameter bound’: the time it takes to get between the two farthest sites on the lattice is (at least) proportional to the distance between those two sites (the ‘diameter’). Spatial search is always limited by this diameter lower bound, no matter what kind of lattice you use.
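A quick sketch of how the diameter scales with dimension (the function is purely illustrative, and assumes the site count is a perfect d-th power): on a d-dimensional square lattice with N sites, the side length is N^(1/d), and the two farthest corners differ by (side − 1) in each of the d coordinates.

```python
# Graph diameter of a d-dimensional square lattice with n_sites vertices.

def lattice_diameter(n_sites, dim):
    side = round(n_sites ** (1.0 / dim))
    assert side ** dim == n_sites, "n_sites must be a perfect dim-th power"
    # Farthest pair of sites: opposite corners, (side - 1) steps per dimension.
    return dim * (side - 1)

# On a line (d = 1) the diameter scales like N itself...
print(lattice_diameter(1024, 1))  # 1023
# ...but in 2D it scales like sqrt(N).
print(lattice_diameter(1024, 2))  # 62
```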
So what do we know now about lower bounds? We know that Grover’s algorithm is optimal for the spatially non-separated case, so we know that if we make things harder by spatially separating everything, we can do no better than something that scales like [tex]\sqrt{N}[/tex]. Not being able to do better is a lower bound! So, we know that spatial search is Ω(diameter) (the diameter bound) and [tex]\Omega(\sqrt{N})[/tex] (the Grover bound). Whichever one is bigger is the better (i.e. tighter) bound. In the one-dimensional square lattice case, the diameter bound wins.
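The “whichever is bigger wins” comparison can be sketched directly (again, the function names are just illustrative, and the lattice is assumed to have a perfect d-th-power number of sites):

```python
import math

# Spatial search is Omega(diameter) and Omega(sqrt(N)); the larger of the two
# is the tighter lower bound. On a d-dimensional square lattice the diameter
# scales like N**(1/d).

def better_lower_bound(n_sites, dim):
    side = round(n_sites ** (1.0 / dim))
    diameter = dim * (side - 1)
    grover = math.sqrt(n_sites)
    return ("diameter", diameter) if diameter > grover else ("Grover", grover)

print(better_lower_bound(10**6, 1))  # the diameter bound wins on a line
print(better_lower_bound(10**6, 3))  # the Grover bound wins in 3D
```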
But what about in higher dimensions…?
very neat. I wonder how useful you find the big-Oh notation? There was an article in the recent Comms of the ACM about how CS would be more of a big-s ‘Science’ if we could transcend big-Oh.
What’s your take on that?
I find the notation very useful. Mostly because I need to distinguish between upper bounds, lower bounds and tight bounds all the time.
I’m not sure, though, what you mean by transcending big-Oh. Can you explain what the article proposes/points out, or give a link?
The direct quote is:
— Is Computer Science Science? Peter J. Denning, Communications of the ACM Vol 48, No 4, P27-31
I could see how in some more developed areas of computer science, more exact calculations of efficiency are needed. But I still think big-O gives one a simple way of finding revolutionary new algorithms — ones that aren’t going to shave a few processor cycles off here or there, but will be a whole new class of algorithm. And if quantum computing didn’t hold the promise of doing this in at least a few areas, there would be a lot less reason to work on it.
MDA, remind me to ask you at some stage exactly how these search algorithms work, especially the spatial ones — I’m interested to hear how one incorporates growing amounts of entanglement into the algorithm, as more data arrives — if indeed that is what is done.
Uber, Well… we don’t have any real sets of data. I suppose it’s possible, but some better measure will entirely depend on the physical implementation of the quantum computer (which does not yet exist).
Better theoretical models can be used to model the system (please note I’ve played a little fast and loose with the one I’ve described), but they become intractable fast. I think it’s typically easier to roll higher-order effects into your error analysis.
Also, be wary of changing the problem under examination. Take as an example classical search on a database about whose structure you know nothing. As soon as you mention real world data, you’re probably talking about a different problem: classical search on a database about whose structure you can get away with assuming something(s).
Which problem is more useful is up for debate.
Paul.za, that’s not really what happens, though perhaps I don’t understand your question. Most of the search problems we consider deal only with finding one piece of information; we never care what the rest of the pieces in the database are.
Most algorithms set up a clever Hamiltonian and evolve the system for a fixed time. If the algorithm does its job, you end up in the desired location with high probability.
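As a sketch of that “pick a Hamiltonian, evolve for a fixed time” idea, here’s the analog (Farhi–Gutmann) version of Grover search, simulated with NumPy. Note this is not a spatial lattice Hamiltonian — it ignores locality — and the size N = 64 and marked index are arbitrary choices for illustration. With H = |w⟩⟨w| + |s⟩⟨s|, where |w⟩ is the marked item and |s⟩ the uniform superposition, evolving |s⟩ for time t = (π/2)·√N rotates it onto |w⟩.

```python
import numpy as np

N = 64
marked = 17                            # arbitrary marked item for illustration
w = np.zeros(N); w[marked] = 1.0       # |w>, the marked basis state
s = np.ones(N) / np.sqrt(N)            # |s>, the uniform superposition

# Farhi-Gutmann search Hamiltonian: H = |w><w| + |s><s|.
H = np.outer(w, w) + np.outer(s, s)

# Evolve |s> under exp(-iHt) via the spectral decomposition of H.
E, V = np.linalg.eigh(H)
t = (np.pi / 2) * np.sqrt(N)
psi = V @ (np.exp(-1j * E * t) * (V.conj().T @ s))

# The state has rotated onto the marked item.
print(abs(psi[marked]) ** 2)           # ~ 1.0 (up to numerical error)
```

The evolution stays in the two-dimensional subspace spanned by |w⟩ and |s⟩, which is why a single, well-chosen evolution time suffices.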
At the moment, I’m working on lower bounds, though. Currently, I’m a bit less familiar with algorithmic work, which deals with upper bounds.
[…] uation Filed under: Uncategorized — jjk @ 12:33 am Following the lead of Paul and Mike, I have decided to write a little bit about what I work on at Caltech. But, instead of writing […]
Yeah yeah yeah… computer modeling or pencil-and-paper or whatever you do, that’s all very nice. But can you Crochet it?
Hey Mike. I found your site through Holly’s. This is interesting stuff, and while I don’t understand most of it, I am excited about the implications of quantum computing. I’ll be able to play cooler video games at higher frame rates!!
L, no I cannot
:)
James, thanks for dropping by. I’ll let you know when the first Quantum Algorithm based version of the Half Life engine comes out.
[…] Let me offer you my perspective; Greg’s mileage may vary. Over the past several years (perhaps since senior year of college), my attention span has been diminishing (thankfully, only algebraically). I blame only myself and my lack of rigour. Nevertheless, my lacking in the powers of concentration has severely limited my ability to perform certain tasks like determining a tight bound on spatial quantum search, understanding the nuances of quantum pattern matching, or shaving. […]