Second installment reviewing Susskind's Quantum Mechanics: The Theoretical Minimum. Here was the first.
1. I've read this chapter of Susskind several times. I have some sense of it but continue to find the writing frustrating. I have the sense that it would not be hard to make this material more comprehensible to an invested lay person like me, but it seems written by and for Sheldon. At some point it will more fully click and I will be able to add the necessary paragraphs.
2. Section 2.1 So I understand what he's saying in this first section and can already add the necessary background. There is an almost-century-old debate in quantum physics about whether quantum uncertainty is due to there being hidden factors or "hidden variables" that would make the quantum world predictable. The majority don't think there are any. They just think the quantum world has a fundamental uncertainty built into it.
3. Section 2.2 This section sheds a little light on the first chapter and the bras and kets of linear algebra. Still, it feels like Men in Black where Tommy Lee Jones is mid-conversation with Will Smith after flashing him with the memory thingee.
We still don't know what "spin" is, but it is apparently the most fundamental quantum characteristic. For you chemistry buffs, I believe it relates to the final options in the 1s2, 2s2, 2p6 stuff. The two electrons in an s orbital, for example, have two different spins. One is said to have a +1/2 spin and the other a −1/2 spin. Why couldn't he have told us this? Give us something to hold on to, man.
"All possible spin states can be represented in a two-dimensional vector space" (38). That's how he puts it. My interpretation: the final quantum description has only two possible states.
4. So here is how they describe this sort of state, apparently:
∣A〉 = α_u∣u〉 + α_d∣d〉
First impression is of course that this is unnecessarily complicated but I'm sure it's helpful. And I like Dirac so I'll stomach it. But it sure would be nice if someone gave a straightforward explanation. As best I can tell, here's the explanation he never really gives.
a. ∣A〉 is a ket. It is a box in which we put one characteristic of the quantum situation.
b. There are two possible states for that characteristic. Say it is spin. We might say that spin can be up or down. ∣u〉 is the place we check the "up" box. ∣d〉 is where we check the "down" box.
These are "basis vectors." They are like the x, y, and z axes in normal geometry, but we can't picture the nature of basis vectors in the quantum world.
I believe up and down are "orthogonal." That is to say, the spin can't be both. If the state is up, it cannot be down. If it is down, it cannot be up.
c. α_u and α_d are the values, the components, that go with up and down. These apparently are complex numbers (that is, they have an imaginary component). I have a hunch they relate to the values of Schrodinger's equation, but making such connections would be far too helpful for Susskind to mention.
I am making the connection because he calls these values "probability amplitudes" and mentions that their squared magnitudes are probabilities. I know from elsewhere that this notion relates to Schrodinger's equation, which is about the possible states an electron can be in.
d. The total probability that the spin is either up or down has to equal 1. It is something. α_u*α_u + α_d*α_d has to equal 1, where the starred version is the complex conjugate.
e. "The state of a system" ∣A〉 "is represented by a unit vector" αu "in a vector space of states" ∣u〉. "The squared magnitudes of the components of the state vector" (αu*αu), "along particular basis vectors, represent probabilities for various experimental outcomes" (40).
5. Susskind uses the analogy (I think) of x, y, and z axes. They aren't really spatial coordinates like that; it's an analogy, I think, to help us understand. What he is trying to picture are quantum categories that are orthogonal to each other just like the x, y, and z axes are orthogonal to each other.
So say the first "axis" we measure is the z axis and then we want to measure the x axis. When we measured the z axis, it had to be either up or down. If we multiply the amplitude for up by its conjugate and the amplitude for down by its conjugate and add the two together, we have to get 1.
If we then move from the z axis to the x axis, there is a half chance that we go from it being up to it being right and a half chance that we go from it being down to it being right. So what amplitude, when multiplied by its own conjugate, gives a one-half probability of right after either up or down?
∣r〉 = 1/√2∣u〉 + 1/√2∣d〉
Then the inner product of this with the corresponding state for left has to equal 0, because left and right are orthogonal to each other, suggesting that the state for left has to be:
∣l〉 = 1/√2∣u〉 - 1/√2∣d〉
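Here is my own sanity check on those two vectors, again in numpy. The vdot function conjugates its first argument, which as far as I can tell is what turning a ket into a bra amounts to:

    import numpy as np

    u = np.array([1, 0], dtype=complex)
    d = np.array([0, 1], dtype=complex)

    right = (u + d) / np.sqrt(2)   # |r>
    left = (u - d) / np.sqrt(2)    # |l>

    print(abs(np.vdot(u, right)) ** 2)   # 0.5 -- chance of right given up
    print(abs(np.vdot(d, right)) ** 2)   # 0.5 -- chance of right given down
    print(np.vdot(left, right))          # 0  -- left and right are orthogonal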
6. So now he moves on to the y axis, where the two states are "in" and "out." There is a half probability of going from any of the up/down or right/left states to either of these. For example, there is a half probability that we would go from an "up" state to a "left" state, or from a "left" state to an "in" state. So the squared magnitude of 〈i∣l〉 has to equal 1/2, just as the one for "up" to "right" has to be 1/2. As before, the probability of it being both in and out is zero.
I don't quite see how the math works out, but he suggests this means that the states for in and out turn out to be:
∣i〉 = 1/√2∣u〉 + i/√2∣d〉
∣o〉 = 1/√2∣u〉 - i/√2∣d〉
The reason this seems peculiar to me is that 〈i∣o〉 seems to come out to 1 rather than 0, and all the other probability calculations still have an i in them. Obviously I don't understand something here yet.
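My best guess at what I'm missing (a guess on my part, not something the book spells out right here): turning the ket ∣i〉 into the bra 〈i∣ means taking the complex conjugates of its components, so the +i/√2 becomes a −i/√2. Do that and 〈i∣o〉 comes out to 0, and the i's drop out of the probabilities. A quick numerical check of my own:

    import numpy as np

    u = np.array([1, 0], dtype=complex)
    d = np.array([0, 1], dtype=complex)
    left = (u - d) / np.sqrt(2)

    in_ket = (u + 1j * d) / np.sqrt(2)    # |i>
    out_ket = (u - 1j * d) / np.sqrt(2)   # |o>

    # vdot conjugates its first argument, i.e. it builds the bra for us.
    print(np.vdot(in_ket, out_ket))            # 0 -- in and out are orthogonal
    print(abs(np.vdot(in_ket, left)) ** 2)     # 0.5 -- half probability, as advertised
    print(abs(np.vdot(in_ket, u)) ** 2)        # 0.5

If that's the right reading, the moral is that a bra is not just a ket written backwards; the conjugation is what keeps the probabilities real.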
7. Another thing I don't understand is what he is calling a phase factor (e^iθ). He says it has magnitude 1 and that a state vector can be multiplied by it without changing anything physically measurable. I'm tucking it away until at some point we see why the heck he's telling us about it.
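The one thing I can check for myself is what the claim seems to mean: multiply a state by e^(iθ) and none of the probabilities budge. Again, a sketch of my own, not the book's:

    import numpy as np

    u = np.array([1, 0], dtype=complex)
    d = np.array([0, 1], dtype=complex)

    A = (u + 1j * d) / np.sqrt(2)    # some normalized state
    theta = 0.7                      # any angle at all
    phase = np.exp(1j * theta)       # e^(i*theta); its magnitude is 1

    A_shifted = phase * A
    print(abs(np.vdot(u, A)) ** 2, abs(np.vdot(u, A_shifted)) ** 2)   # both 0.5
    print(abs(np.vdot(d, A)) ** 2, abs(np.vdot(d, A_shifted)) ** 2)   # both 0.5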
Once again, he describes a lot of trees but gives us no sense of why any of these things are important or how they relate to anything. Waiting to see the relevance...