What is a simple way to learn "factors and algorithms"?
I. If you are asking about integer factorization, the most important things to know are the multiplication tables and the divisibility rules. Memorize the multiplication tables.
The divisibility rules include:

- 2: every even number is divisible by 2 (being divisible by 2 means that 2 is a factor of the number);
- 3: if the sum of a number's digits is a multiple of 3, the number itself is a multiple of 3 (e.g. 3 divides 123 since 1+2+3=6 is a multiple of 3);
- 4: if the last two digits form a multiple of 4, then the entire number is a multiple of 4;
- 5: if the number ends in 0 or 5, it is a multiple of 5;
- 6: if the number has both 2 and 3 as factors, then 6 is also a factor;
- 8: if the last three digits form a multiple of 8, then 8 is a factor;
- 9: if the sum of the digits is a multiple of 9, then 9 is a factor of the number.
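The digit-based rules above can be spot-checked in a few lines of Python; this is a minimal sketch (the function names are my own) that verifies each rule agrees with a direct remainder test:

```python
# Sketch of the digit-based divisibility rules, checked against n % d.

def digit_sum(n):
    return sum(int(d) for d in str(abs(n)))

def divisible_by_3(n):
    return digit_sum(n) % 3 == 0      # sum-of-digits rule for 3

def divisible_by_4(n):
    return (abs(n) % 100) % 4 == 0    # last-two-digits rule for 4

def divisible_by_8(n):
    return (abs(n) % 1000) % 8 == 0   # last-three-digits rule for 8

def divisible_by_9(n):
    return digit_sum(n) % 9 == 0      # sum-of-digits rule for 9

# Each rule matches the direct divisibility test on a range of numbers.
for n in range(1, 10000):
    assert divisible_by_3(n) == (n % 3 == 0)
    assert divisible_by_4(n) == (n % 4 == 0)
    assert divisible_by_8(n) == (n % 8 == 0)
    assert divisible_by_9(n) == (n % 9 == 0)
```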
The standard method of factoring an integer is a factor tree.
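A factor tree repeatedly splits a number into smaller factors until only primes remain. The same prime factorization can be computed by trial division; here is a short sketch:

```python
# Trial division: finds the prime factorization a factor tree would produce.

def prime_factors(n):
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # divide out each prime as many times as it fits
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains is itself prime
        factors.append(n)
    return factors

print(prime_factors(60))  # [2, 2, 3, 5], i.e. 60 = 2 * 2 * 3 * 5
```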
II. If you are asking about factoring polynomials, there are a number of algorithms to assist you. Note that guess-and-check is always a viable, though not always efficient, method for factoring a polynomial. The other algorithms can be more efficient (i.e., faster) but are not necessarily better. It is like being a carpenter -- having access to 12 different sizes of hammer is nice, but you could probably get the job done with 3 or 4 sizes.
The typical algorithms to know address special cases, and you will want to recognize these when they come up. So it is not only a matter of memorizing, but also one of pattern recognition. For example, the difference of 2 squares factors as a^2-b^2=(a+b)(a-b) regardless of whether a and b are numbers, functions, polynomials, etc. So dog^2-cat^2=(dog + cat)(dog - cat) for whatever "dog" and "cat" are.
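Because the identity holds no matter what a and b stand for, it can be spot-checked numerically; a quick sketch:

```python
# Spot-check of the difference-of-squares identity a^2 - b^2 = (a + b)(a - b)
# on random integer values of a and b ("dog" and "cat" can be anything).

import random

for _ in range(1000):
    a = random.randint(-1000, 1000)
    b = random.randint(-1000, 1000)
    assert a * a - b * b == (a + b) * (a - b)
```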
The typical algorithms include the difference of 2 squares, the sum or difference of cubes, perfect square trinomials and factoring by grouping.
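As one illustration, factoring by grouping can also be verified numerically. The polynomial below is my own example, not from the original: group x^3 + 3x^2 + 2x + 6 as x^2(x + 3) + 2(x + 3) = (x^2 + 2)(x + 3), then check that both forms agree at many points:

```python
# Factoring by grouping, checked on x^3 + 3x^2 + 2x + 6:
#   x^2(x + 3) + 2(x + 3) = (x^2 + 2)(x + 3)

def original(x):
    return x**3 + 3 * x**2 + 2 * x + 6

def factored(x):
    return (x**2 + 2) * (x + 3)

# Two cubic polynomials that agree at more than 3 points are identical.
assert all(original(x) == factored(x) for x in range(-10, 11))
```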