A while back my supervisor tried to write me up. I handed him my written response and he looked at me and said, "That was really well written." Then he dropped the write-up.
I see you guys confusing FPS and Hz. The brain can only process about 30 fps; anything over that is waste. How that translates to screen refresh rate is beyond me, though.
The refresh rate on a display is essentially the hard-wired "frames per second" of the display hardware. If a display is running at 30 Hz, every 33 milliseconds it's going to take whatever is in video memory and show it on the screen. (It's actually a little more complicated than that, but for our purposes, it's a reasonable enough definition.)
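If it helps, the arithmetic is just the reciprocal of the refresh rate. A quick Python sketch (the numbers are what matter, not the language):

```python
# Back-of-the-envelope: the time between refreshes is just 1 / refresh_rate.
for hz in (24, 30, 60, 120, 240):
    interval_ms = 1000.0 / hz
    print(f"{hz:>3} Hz display refreshes every {interval_ms:.1f} ms")
```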
FPS is essentially the rate at which you're updating video memory. For TV, DVDs, and such, this is also essentially hard-wired, either in the disc/DVD encoding or in the broadcast standard. If your FPS doesn't match your device's refresh rate, you'll either be displaying some frames twice (no big deal), or, if the display is slower than the source, you'll end up with dropped frames, which is no good and things will start to look jumpy. There are plenty of movies shot at 24p because that's an old standard from mechanical film cameras that a lot of directors think feels more 'cinematic'. I don't really understand the logic of 240 Hz TVs when a lot of your source material only changes at 24 Hz.
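To make the frame-repetition point concrete, here's a rough Python sketch. `refreshes_per_frame` is just a made-up helper name for illustration, but the arithmetic is the standard pulldown math: 24 fps on 60 Hz gives the classic uneven 3:2 cadence, while 120 Hz and 240 Hz divide evenly.

```python
import math

# Rough sketch: how a fixed-fps source maps onto a display's refresh rate.
# 'refreshes_per_frame' is a hypothetical helper, not a real library call.
def refreshes_per_frame(source_fps, display_hz):
    ratio = display_hz / source_fps
    if ratio == int(ratio):
        return f"every frame shown exactly {int(ratio)} times"
    # Non-integer ratio: frames get repeated unevenly,
    # e.g. the classic 3:2 pulldown for 24 fps on a 60 Hz display.
    return f"frames alternate between {math.floor(ratio)} and {math.ceil(ratio)} refreshes"

for hz in (60, 120, 240):
    print(f"24 fps source on a {hz} Hz display: {refreshes_per_frame(24, hz)}")
```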
For video games, it's different, because the FPS isn't locked to an update time. Every frame takes a different amount of time to draw, depending on what's going on in-game. You're also sampling input and affecting the 'world' based on that. Ideally, you want to be running at greater than 30 fps so everything looks like smooth motion, but you also don't want huge discrepancies in the time it takes to draw any given frame, or things start to feel really janky, input feels really inconsistent, etc.
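Here's a bare-bones sketch of a variable-timestep game loop (not any particular engine's API, just an illustration): the world advances by speed * dt each frame, so if dt swings wildly from frame to frame, the amount things move swings too, and that's what reads as jank.

```python
import time

# Bare-bones variable-timestep loop (a sketch, not a real engine).
# Each iteration: measure how long the previous frame took (dt), sample input,
# advance the world by speed * dt, then "render".
def run(frames=10):
    position = 0.0
    speed = 100.0                # world units per second
    prev = time.perf_counter()
    for _ in range(frames):
        now = time.perf_counter()
        dt = now - prev          # seconds the last frame took
        prev = now
        # ... sample input here ...
        position += speed * dt   # world update scaled by frame time
        time.sleep(0.016)        # stand-in for ~16 ms of rendering work (~60 fps)
        print(f"dt = {dt * 1000:5.1f} ms   position = {position:7.2f}")

run()
```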