1. One scenario in which support vector machines might not be the right choice is one in which it is hard to define a grouping or pattern that fits all the data. An example might be choosing items to suggest in an online store such as Amazon or the iTunes Store. Since categorizing different music, movies, books, etc. according to preference is by definition highly subjective and differs for each person (someone who likes Studio 60 might hate The West Wing, even though they are both TV dramas made by Aaron Sorkin), it is not likely that a support vector machine could come up with suggestions of items that any particular person actually wanted to buy.

2. Bayes' Law is useful because it allows us to find the likelihood that an event A happened given that we observed a resulting event B. If we have Pr(A) -- the likelihood that A will happen, regardless of anything else; Pr(B) -- the likelihood that B will happen, regardless of anything else; and Pr(B|A) -- the likelihood that B will happen as a result of A, we can calculate Pr(A|B) -- the likelihood that A happened given that B happened -- as Pr(A|B) = Pr(B|A) Pr(A) / Pr(B). This gets around the fact that "if B then A" and "if A then B" are not the same thing. For a real-world example: if we know the likelihood of a car's transmission failing, the likelihood of it running out of gas, and the likelihood of the engine failing (three instances of Pr(A)); the likelihood of the car breaking down (Pr(B)); and the likelihood that the car will break down because of each of these events (Pr(B|A)), we can figure out why our car has broken down. (This assumes that these are the only things that can happen to a car and that they cannot happen all at once. This example is contrived, I know, but it's the best one I could think of that is in some way useful to real life.) This knowledge is useful for a host of things, including insurance claims.
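The car-breakdown example above can be sketched in a few lines of Python. All of the probabilities below are made up purely for illustration (the original doesn't give numbers), and, as in the text, we assume these three faults are the only possible causes and never occur together:

```python
# Pr(A): prior likelihood of each fault occurring, regardless of anything else
# (these numbers are hypothetical, chosen only to make the arithmetic concrete)
prior = {"transmission": 0.05, "out_of_gas": 0.10, "engine": 0.02}

# Pr(B|A): likelihood the car breaks down given that a particular fault occurs
breakdown_given_fault = {"transmission": 0.90, "out_of_gas": 1.00, "engine": 0.95}

# Pr(B): total likelihood of a breakdown, assuming these are the only causes
# and they are mutually exclusive, so we can just sum over them
pr_breakdown = sum(prior[c] * breakdown_given_fault[c] for c in prior)

# Bayes' Law: Pr(A|B) = Pr(B|A) * Pr(A) / Pr(B) for each candidate cause
posterior = {c: breakdown_given_fault[c] * prior[c] / pr_breakdown for c in prior}

# Print the causes from most to least likely, given that the car broke down
for cause, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"{cause}: {p:.3f}")
```

With these made-up numbers, running out of gas comes out as the most likely explanation for the breakdown, even though a failed gas tank guarantees a breakdown no more strongly than the other faults nearly do: its higher prior Pr(A) dominates. That shift from Pr(B|A) to Pr(A|B) is exactly what Bayes' Law buys us.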