Framerate is the frequency at which images appear on a screen or display; (modern) displays work by showing a series of frames (essentially ‘still’ images) in rapid succession, which our visual system perceives as motion. Framerate is expressed in Hz (Hertz) or FPS (Frames Per Second); e.g. a 60Hz monitor will display 60 frames per second. Strictly speaking, Hz refers to a monitor’s refresh rate and FPS to the number of frames your hardware renders, but the two terms are often used interchangeably.
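To make the numbers concrete, every refresh rate corresponds to a frame time: how long each image stays on screen before the next one appears. A minimal sketch of that arithmetic in Python (the listed refresh rates are just illustrative values):

```python
def frame_time_ms(refresh_rate_hz: float) -> float:
    # A display refreshing N times per second shows a new image every 1000 / N milliseconds.
    return 1000.0 / refresh_rate_hz

for hz in (60, 144, 240):
    print(f"{hz}Hz -> {frame_time_ms(hz):.1f} ms per frame")
# 60Hz  -> 16.7 ms per frame
# 144Hz ->  6.9 ms per frame
# 240Hz ->  4.2 ms per frame
```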
Framerate is important in gaming for a number of reasons, most importantly fluidity. The higher the framerate, the more fluid the image on the screen becomes, making it easier and more natural to track fast-moving objects.
The above image shows why higher refresh rate monitors have become the standard in the competitive gaming scene. A regular 60Hz display (by far the most common refresh rate for displays these days) can refresh the image at most 60 times per second, so even if your GPU is capable of rendering far more FPS, you’ll still only see 60 images per second. A 144Hz monitor (now the standard refresh rate for serious gaming monitors) more than doubles the number of images displayed per second, provided your hardware can push enough frames (having the hardware to do so is an important consideration when buying a high refresh rate display), resulting in a much clearer and more fluid image. A rough sketch of this capping effect follows below.
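The rule of thumb behind that capping effect: the number of distinct images you actually see per second is limited by whichever is lower, your GPU’s framerate or the monitor’s refresh rate. A simplified sketch, ignoring V-Sync, tearing and frame pacing, which complicate the real picture:

```python
def displayed_fps(gpu_fps: float, refresh_rate_hz: float) -> float:
    """Rough upper bound on distinct frames shown per second.

    Simplification: assumes the display shows at most one new frame per
    refresh cycle and ignores tearing, V-Sync and frame pacing.
    """
    return min(gpu_fps, refresh_rate_hz)

print(displayed_fps(gpu_fps=300, refresh_rate_hz=60))   # 60  - extra GPU frames are never shown
print(displayed_fps(gpu_fps=300, refresh_rate_hz=144))  # 144 - more than double the images
print(displayed_fps(gpu_fps=100, refresh_rate_hz=144))  # 100 - here the GPU is the bottleneck
```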
Modern gaming monitors go up to 240Hz, providing exceptional clarity and fluidity. Some people claim (mostly jokingly) that the human eye cannot see more than 30 frames per second, but these claims are false and originate from an internet meme. Gaming on a higher refresh rate monitor won’t necessarily make you a better player, but the effect is instantly noticeable, though with diminishing returns. Going from 60Hz to 144Hz makes a world of difference and might even make playing a fast-paced game on a regular 60Hz monitor feel uncomfortable afterwards, whilst going from 144Hz to 200Hz will be a less eye-opening experience (though the difference is still noticeable for most people).
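Those diminishing returns show up clearly in the frame times: each step up shaves off less time per frame than the previous one. A quick illustration, assuming your hardware can sustain each of these rates:

```python
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

# The jump from 60Hz to 144Hz saves far more per frame than the jump from 144Hz to 200Hz.
print(f"60Hz -> 144Hz saves {frame_time_ms(60) - frame_time_ms(144):.1f} ms per frame")   # ~9.7 ms
print(f"144Hz -> 200Hz saves {frame_time_ms(144) - frame_time_ms(200):.1f} ms per frame") # ~1.9 ms
```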