An algorithm is essentially a well-defined set of instructions that a computer carries out automatically to solve a problem. A good example is to ask: "How would you tell a computer to figure out which of the 5 balls I've given you is the heaviest (or lightest)?" To solve this "problem", you need to define a sequence of steps for the computer to carry out in order to reach a conclusion.
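The steps for the ball example above can be sketched in Java. This is a minimal sketch, assuming the balls' weights are given as an array of numbers (the values below are made up for illustration): check each ball in turn and remember the heaviest one seen so far.

```java
public class HeaviestBall {

    // Walk through the weights once, remembering the heaviest seen so far.
    static double heaviest(double[] weights) {
        double max = weights[0];
        for (int i = 1; i < weights.length; i++) {
            if (weights[i] > max) {
                max = weights[i];
            }
        }
        return max;
    }

    public static void main(String[] args) {
        // Hypothetical weights for the 5 balls
        double[] weights = {1.2, 3.5, 0.8, 2.9, 3.1};
        System.out.println(heaviest(weights)); // prints 3.5
    }
}
```

The same loop with `<` instead of `>` would find the lightest ball, and the step-by-step nature of it is exactly what makes it an algorithm.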
Algorithms are very common in programming, as you are constantly telling the computer how to solve problems in a step-by-step manner.
Big-O notation is the way we describe how fast a given algorithm is. More precisely, it describes how the algorithm's running time (or number of steps) grows as the size of its input grows.
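As a small illustration (not from the original text), consider searching an array for a value by checking each element in turn. In the worst case, every element gets checked, so the number of steps grows in direct proportion to the input size; Big-O notation summarizes this as O(n):

```java
public class LinearSearch {

    // Returns the index of target in data, or -1 if it isn't there.
    // Worst case: every element is checked once, so this is O(n).
    static int search(int[] data, int target) {
        for (int i = 0; i < data.length; i++) {
            if (data[i] == target) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] data = {4, 8, 15, 16, 23, 42};
        System.out.println(search(data, 23)); // prints 4 (found at index 4)
        System.out.println(search(data, 99)); // prints -1 (all 6 elements checked)
    }
}
```

Double the size of `data` and, in the worst case, the search takes about twice as many steps; that relationship between input size and work done is what Big-O captures.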
Consider this scenario: you type a search term into Google, like "How to Program with Java" or "Java Video Tutorials", you hit search, and you have to wait about 30 seconds before the results appear on the screen... Would you still use Google? Or would you start shopping around for a faster search engine? My guess is you'd start shopping around.
Speed is everything these days, and slow software is infuriating to users, even when they aren't paying for it.