It's like you're in one part of the forest and he's telling you about a set of trees in another part of the forest. And he's not even talking about trees in his part of the forest that are next to each other, but there's one tree he's seen that relates somehow to another important tree he's seen. But he's none too clear even about how those two trees are connected to each other... in some unspecified part of the forest you're not in.
What you need is directions to get from your part of the forest to the part he's in. Then you need to know how to get from one of the trees he's mentioning to another. This level of unnecessary confusion makes me angry, because it's completely avoidable. I'm convinced I could do much better and probably will.
For the record, I don't think it's intentional. I think Susskind really wants to be clear. He just knows the forest too well to tell someone in words how to get around in it, at least someone who's never taken a course in linear algebra. I've wondered if the jokes at the beginning of the chapters are revealing. They seem to demonstrate an inability to grasp what is funny.
But I want to force myself through the book. Richard Feynman once told his sister to read and reread the math and science she didn't understand. I'm convinced this is the way to go with many difficult subjects and authors. So here I go again with Susskind.
2. Chapter one should not be the first chapter. At least a great deal of what is in here should not be first. Most people need to know why they need to know something for the something to stick and make sense. I have come to realize that there is some linear algebra in this first chapter. It uses the notation Paul Dirac introduced to quantum mechanics in 1939.
Wrong place to begin. He's thinking: we learn bras and kets now, then we use them in later chapters. But bras and kets make little sense when you have no idea what they're for. I know complex numbers, but for someone who doesn't, it would be better to introduce them when we need to know them. Show us the problem they help solve and introduce them there.
3. So what is helpful to take away from chapter one at the beginning? Here is some stuff from the beginning of the chapter:
- The idea of state is fundamental to quantum physics. For the moment, let's talk about the simplest system as having just two states: on or off, +1 or -1. Let's call this "two-state system" a one-bit system, a quantum bit or "qubit."
- We could call this "either on or off" the quantum spin. It's not a literal spin.
- Experiments are never gentle. You measure one thing, you mess up everything else. You've lost information about one quantity because you've chosen to measure another. (Think Heisenberg's uncertainty principle.)
- What is predictable on the quantum level is not the individual outcome of some measurement, but the statistical average. Individual outcomes are not predictable, but the averages are. (I play with a little simulation of this in the first sketch after the list.)
- The quantum mechanical notation for the statistical average is Dirac's bracket notation: 〈Q〉 .
- The "space of states," the possible values or states of something is a "vector space" in quantum physics. (linear algebra) Another name for such a "space" is a Hilbert space. There could be an infinite number of elements. This is all very abstract. For the moment, I'm just picturing a box you put stuff in, and different boxes will only take a certain number of things.
- The elements of a vector space are called kets or ket-vectors. The notation Dirac used for these is ∣A〉
- A ket is often written as a column of complex numbers. We are being set up for matrix multiplication.
- The "row" matrix that is pit against the "column" matrix of the ket is the bra. The bra looks like this: 〈B∣ .
- bra-ket: multiply the first entry of the row by the first entry of the column, the second by the second, and so on, then add everything up. The inner product 〈B∣A〉 of two vectors is the result of this operation. (There's a small code sketch of this after the list.)
- A vector is normalized if its inner product with itself is 1.
- Two vectors are orthogonal if their inner product with each other is 0.
- The dimension of a vector space is the maximum number of mutually orthogonal vectors you can find in it. A set of that many vectors, each normalized and orthogonal to the others, forms an orthonormal basis.
- Finally, there is something called the Kronecker delta. As far as I can see, he never tells us what this is. I know the name from somewhere else. The Kronecker delta is symbolized as δij . It is 0 if i and j have different values and 1 if i and j have the same value. For an orthonormal basis, the inner product of the i-th and j-th basis vectors is exactly δij, which seems to be why it shows up here. (The last sketch after the list checks this.)
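To convince myself about the individual-outcomes-versus-averages point, here is a tiny simulation of my own. It is not from the book, and the probabilities are made up; the only point is that each single ±1 result is unpredictable while the average settles down to something definite.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up preparation: the spin measures +1 with probability 0.7 and -1 with 0.3.
p_plus = 0.7
outcomes = rng.choice([+1, -1], size=100_000, p=[p_plus, 1 - p_plus])

print(outcomes[:10])     # individual results: an unpredictable run of +1s and -1s
print(outcomes.mean())   # the average <Q> is close to 0.7*(+1) + 0.3*(-1) = 0.4
```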
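Here is the bra-ket arithmetic spelled out with plain numpy arrays. The particular numbers are ones I made up; the point is just that the bra is the row of complex conjugates of a ket's entries, and the inner product is "multiply entry by entry, then add."

```python
import numpy as np

# Two made-up kets, each a column of complex numbers.
ket_A = np.array([1 + 1j, 2 - 1j])
ket_B = np.array([0 + 1j, 1 + 0j])

# The bra <B| is the row of complex conjugates of |B>'s entries.
bra_B = ket_B.conj()

# The inner product <B|A>: first times first, second times second, summed.
inner = np.sum(bra_B * ket_A)
print(inner)                 # a single complex number

# Same thing using numpy's built-in dot product.
print(np.dot(bra_B, ket_A))
```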
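And the connection between an orthonormal basis and the Kronecker delta, using the two-state (qubit) case. The "up" and "down" column vectors below are the standard textbook choice, my own filling-in rather than a quote from the chapter.

```python
import numpy as np

# Standard basis kets for a two-state system (a qubit).
up = np.array([1 + 0j, 0 + 0j])
down = np.array([0 + 0j, 1 + 0j])
basis = [up, down]

# For an orthonormal basis, <i|j> equals the Kronecker delta:
# 1 when i == j (each vector is normalized), 0 when i != j (they are orthogonal).
for i, ket_i in enumerate(basis):
    for j, ket_j in enumerate(basis):
        inner = np.dot(ket_i.conj(), ket_j)
        delta = 1 if i == j else 0
        print(f"<{i}|{j}> = {inner.real:.0f}, delta_{i}{j} = {delta}")
```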