Spatial Quantum Search

Filed under: a group of folks, academe, physics @ 01:03

Searching in Higher Dimensions

If we examine a three dimensional cubic lattice (the simple cubic crystal structure) with N sites, we see that the diameter scales like [tex]\sqrt[3]{N}[/tex]. So the diameter bound is smaller than the Grover bound of [tex]\Omega(\sqrt{N})[/tex]. Indeed, tight algorithms ([2], [3], [4]) that scale like [tex]\sqrt{N}[/tex] have been developed for searching square lattices in three and higher dimensions.

It’s in two dimensions that things get tricky. The diameter of a two dimensional square lattice is [tex]\sqrt{N}[/tex], so the diameter bound and the Grover bound are comparable. It’s unclear how they interact; the best upper bound currently known for this lattice is [tex]O(\sqrt{N}\log{N})[/tex], and the best known lower bound is Grover’s: [tex]\Omega(\sqrt{N})[/tex].

So that’s the background. What am I doing? I’m trying to tighten the bounds for the two dimensional case (both from above and below), and also to figure out exactly how the bound changes as we move from one dimension to three. For the line, we have a tight bound that scales like N, and for the cube, we have a tight bound that scales like [tex]\sqrt{N}[/tex]. Where, how and exactly why does it change?
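To make the crossover concrete, here’s a quick numerical illustration (my own, not from the references above) of how the two lower bounds compare across dimensions: a local algorithm on a d-dimensional lattice needs at least diameter ~ [tex]N^{1/d}[/tex] steps just to cross it, while Grover gives [tex]\Omega(\sqrt{N})[/tex] regardless of geometry, so the effective lower bound is whichever is larger.

```python
import math

# Diameter bound vs Grover bound for a d-dimensional square lattice
# with N sites. The diameter scales like N^(1/d); Grover's bound is
# sqrt(N) regardless of geometry. d = 2 is exactly where they meet.
N = 10**6
for d in (1, 2, 3):
    diameter = N ** (1 / d)
    grover = math.sqrt(N)
    if diameter > grover + 1e-9:
        label = "diameter bound dominates"
    elif diameter < grover - 1e-9:
        label = "Grover bound dominates"
    else:
        label = "comparable (the tricky case)"
    print(f"d={d}: diameter ~ {diameter:.0f}, Grover ~ {grover:.0f} -> {label}")
```

For the line (d=1) the diameter term wins, giving the tight N bound; for the cube (d=3) Grover wins, giving the tight [tex]\sqrt{N}[/tex] bound; at d=2 the two coincide, which is exactly why that case is open.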

The method by which I try to figure out these questions? I count. A lot.


    Uber 04.17.2005 @ 17:53

    very neat. I wonder how useful you find the big-Oh notation? There was an article in the recent Comms of the ACM about how CS would be more of a big-s ‘Science’ if we could transcend big-Oh.

    What’s your take on that?

    MDA 04.18.2005 @ 00:45

    I find the notation very useful. Mostly because I need to distinguish between upper bounds, lower bounds and tight bounds all the time.

    I’m not sure, though, what you mean by transcending big-Oh. Can you explain what the communication proposes/pointed out or give a link?


    The direct quote is:

    Experimental Algorithmicists study the performance of real algorithms on real data sets and formulate models to predict their time and storage requirements; they may one day produce a more accurate theory than Big-O-Calculus and include a theory of locality

    Is Computer Science Science? Peter J. Denning, Communications of the ACM, Vol. 48, No. 4, pp. 27-31

    Paul.za 04.18.2005 @ 10:14

    I could see how in some more developed areas of computer science, more exact calculations of efficiency are needed. But I still think big-O gives one a simple way of finding revolutionary new algorithms — ones that aren’t going to shave a few processor cycles off here or there, but will be a whole new class of algorithm. And if quantum computing didn’t hold the promise of doing this in at least a few areas, there would be a lot less reason to work on it.

    MDA, remind me to ask you at some stage exactly how these search algorithms work, especially the spatial ones — I’m interested to hear how one incorporates growing amounts of entanglement into the algorithm as more data arrives — if indeed that is what is done.

    MDA 04.18.2005 @ 14:49

    Uber, Well… we don’t have any real sets of data. I suppose it’s possible, but any better measure will depend entirely on the physical implementation of the quantum computer (which does not yet exist).

    Better theoretical models can be used to model the system (please note I’ve played a little fast and loose with the one I’ve described), but they become intractable fast. I think it’s typically easier to roll higher order effects into your error analysis.

    Also, be wary of changing the problem under examination. Take as an example classical search on a database about whose structure you know nothing. As soon as you mention real world data, you’re probably talking about a different problem: classical search on a database about whose structure you can get away with assuming something(s).

    Which problem is more useful is up for debate.
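    As a toy illustration of that distinction (my example, not part of the thread): if you can assume the data is sorted, binary search finds the target in O(log N) comparisons, while with no structural assumptions you can do no better than a linear scan, which is Θ(N) in the worst case.

```python
from bisect import bisect_left

def unstructured_search(items, target):
    """No assumptions about the data: worst case we touch every item."""
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def structured_search(sorted_items, target):
    """Assumes the data is sorted: O(log N) comparisons via bisection."""
    i = bisect_left(sorted_items, target)
    if i < len(sorted_items) and sorted_items[i] == target:
        return i
    return -1

data = list(range(0, 100, 3))         # sorted by construction
print(unstructured_search(data, 42))  # 14
print(structured_search(data, 42))    # 14
```

    Same answer, but only because the structured version was handed a promise about the database; drop that promise and you’re back to the unstructured problem Grover’s bound applies to.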

    Paul.za, that’s not really what happens, though perhaps I don’t understand your question. Most of the search problems we consider only deal with finding one piece of information. What the rest of the pieces in the database are we never care about.

    Most algorithms set up a clever Hamiltonian and evolve the system for a fixed time. If the algorithm does its job, you end up at the desired location with high probability.
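    A minimal sketch of that “set up a Hamiltonian and evolve for a fixed time” recipe is the analog, continuous-time version of Grover search in the style of Farhi and Gutmann — my illustrative choice, not the spatial-search Hamiltonians from the papers above. With [tex]H = |s\rangle\langle s| + |w\rangle\langle w|[/tex], where [tex]|s\rangle[/tex] is the uniform superposition and [tex]|w\rangle[/tex] the marked site, evolving for [tex]t = \tfrac{\pi}{2}\sqrt{N}[/tex] rotates [tex]|s\rangle[/tex] onto [tex]|w\rangle[/tex].

```python
import numpy as np

# Analog (continuous-time) Grover search, Farhi-Gutmann style.
# |s> is the uniform superposition over N sites, |w> the marked site;
# H = |s><s| + |w><w| rotates |s> onto |w> in time t = (pi/2)*sqrt(N).
N = 16
w = 5                              # index of the marked site (arbitrary)
s = np.full(N, 1 / np.sqrt(N))     # uniform superposition |s>
H = np.outer(s, s)                 # |s><s|
H[w, w] += 1.0                     # ... + |w><w|

# Evolve |s> under exp(-iHt) by diagonalizing H (real symmetric).
t = (np.pi / 2) * np.sqrt(N)
vals, vecs = np.linalg.eigh(H)
psi = vecs @ (np.exp(-1j * vals * t) * (vecs.T @ s))
p_success = abs(psi[w]) ** 2
print(f"P(marked site) at t = (pi/2)*sqrt(N): {p_success:.3f}")  # ~1.000
```

    The dynamics stay in the two-dimensional span of [tex]|s\rangle[/tex] and [tex]|w\rangle[/tex], which is why a fixed evolution time suffices; the spatial versions replace [tex]|s\rangle\langle s|[/tex] with a lattice hopping term, and that geometry is where the [tex]\sqrt{N}[/tex]-vs-diameter tension comes from.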

    At the moment, though, I’m working on lower bounds; I’m a bit less familiar with the algorithmic work, which deals with upper bounds.


    [...] uation Filed under: Uncategorized — jjk @ 12:33 am Following the lead of Paul and Mike, I have decided to write a little bit about what I work on at Caltech. But, instead of writing [...]

    L 05.06.2005 @ 01:22

    Yeah yeah yeah… computer modeling or pencil-and-paper or whatever you do, that’s all very nice. But can you Crochet it?

    James 05.09.2005 @ 15:01

    Hey Mike. I found your site through Holly’s. This is interesting stuff, and while I don’t understand most of it, I am excited about the implications of quantum computing. I’ll be able to play cooler video games at higher frame rates!!

    MDA 05.09.2005 @ 15:06

    L, no I cannot :)

    James, thanks for dropping by. I’ll let you know when the first Quantum Algorithm based version of the Half Life engine comes out.


    [...] Let me offer you my perspective; Greg’s mileage may vary. Over the past several years (perhaps since senior year of college), my attention span has been diminishing (thankfully, only algebraically). I blame only myself and my lack of rigour. Nevertheless, my lacking in the powers of concentration has severely limited my ability to perform certain tasks like determining a tight bound on spatial quantum search, understanding the nuances of quantum pattern matching, or shaving. [...]

© mdawaffe (Michael D Adams) - Powered by WordPress - Full Credits