What I've learned about combinatorics
I've been studying this funky branch of discrete math for a couple of weeks now.
I never got any proper insight into it in high school, but now I'm interested. Patterns, combinations of patterns, noticing
the patterns and manipulating the patterns. And it's discrete. In a very abstract way
I can see the connections to code analysis and visualisation. In general, a combinatorics
problem is to find the size of a subset when a specific predicate is given: { (a b c...) | a b c... has some property }
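That "size of a filtered subset" idea translates almost directly into code. A minimal Python sketch (the name `count_matching` is mine, not a standard function):

```python
from itertools import permutations

def count_matching(elements, predicate):
    """Count the permutations of `elements` that satisfy `predicate`."""
    return sum(1 for p in permutations(elements) if predicate(p))

# Example: permutations of [a b c] whose first element is a.
print(count_matching("abc", lambda p: p[0] == "a"))  # → 2
```

Brute force like this is how I check my combinatorial answers on small sets.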
Example: we have a set [a b c]. We generate all six permutations as a new set, and we ask:
what is the size of the subset { (x y z) | x = a, y != a, z != a }? Basically, how many elements are there
which start with a? Quick combinatorial thinking says that every element in our subset has the
form [a _ _], so 2 places are left. Then how many ways are there to fill those places? We have b and c left
and _ _ places to fill: [a b c] [a c b]. We can imagine it as a graph.
      B -> C
A <
      C -> B

One way to fill the first spot. Two ways to fill the second spot. One way to fill the last spot.
We had the predicate (the 1st element must be a). We found the pattern (2 ways for the second spot * 1 way for the last = 2).
We extracted the info (made a nice graph).
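The graph's branch count can be sanity-checked by enumeration; once the first spot is pinned, the rest is a free permutation of what's left, so the count is (3 - 1)!. A quick sketch:

```python
from itertools import permutations
from math import factorial

# Pin 'a' in the first spot; the remaining spots are a free permutation.
perms = [p for p in permutations("abc") if p[0] == "a"]
print(perms)  # [('a', 'b', 'c'), ('a', 'c', 'b')]

# The graph's branches: 1 way * 2 ways * 1 way = (3 - 1)!
assert len(perms) == factorial(3 - 1) == 2
```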
[A B C D]: how many orderings can we make where A always comes before B?
[A _ _ B] [_ A _ B]... Let's build the solution this way. How many ways can I arrange A and B in 4 spots?
[_ _ A B] [A _ B _] [A B _ _]... and so on. Count: A can go to 3 places, [A _ _ _] [_ A _ _] [_ _ A _].
For the first placement B has three spots after A. For the second, two. For the third, one. 3 + 2 + 1 = 6.
We have half the solution, which visually looks like:
[A B _ _]
[A _ B _]
[A _ _ B]
[_ A B _]
[_ A _ B]
[_ _ A B]
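The six half-solutions above are exactly the ways to choose 2 of the 4 spots, with A taking the earlier spot and B the later one. A sketch with itertools.combinations:

```python
from itertools import combinations

# Choose 2 of the 4 spots; A takes the earlier spot, B the later one.
rows = []
for a_spot, b_spot in combinations(range(4), 2):
    row = ["_"] * 4
    row[a_spot], row[b_spot] = "A", "B"
    rows.append("[" + " ".join(row) + "]")
print("\n".join(rows))
```

This prints the same six rows as the listing above, in the same order.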
The pattern here is that every arrangement branches into all the ways we can fill the last _ _ spots.
We have only C and D left, so it's basically the same as in the first problem: 2 ways for the first spot
times 1 way for the second. Six branches, each with two sub-branches.
6 * 2 = 12.
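A brute-force sanity check of the whole answer, as a small sketch:

```python
from itertools import permutations

# Count permutations of [A B C D] where A comes before B.
count = sum(1 for p in permutations("ABCD") if p.index("A") < p.index("B"))
print(count)  # 12
```

Which also matches the symmetry argument: in exactly half of the 24 permutations A precedes B.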
The answer is not important. The process is.