There's no formula here; limit your database according to what you consider reasonable for your application's needs. Typically, servers whose applications use a connection pool shouldn't need more than a few hundred concurrent connections, and a small-to-medium website can often get by with 100 - 200.
I usually set up a new server with max_connections somewhere in the 500 - 800 range and see how it goes. You can always change it dynamically:
SET GLOBAL max_connections = 567;
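Before raising the limit, it helps to check how close you actually get to it; MySQL's status counters show both the current and the peak connection count (the 567 above is just an arbitrary example value):

```sql
-- The configured limit:
SHOW VARIABLES LIKE 'max_connections';
-- Connections open right now:
SHOW STATUS LIKE 'Threads_connected';
-- High-water mark since the server last started:
SHOW STATUS LIKE 'Max_used_connections';
```

Note that SET GLOBAL does not survive a server restart; to make the value permanent, also set max_connections under the [mysqld] section of your my.cnf.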
Make sure, though, that you also set a proper open_files_limit. On Linux, a process is limited to 1024 open files by default. This is very low, since every thread, connection, and, of course, file consumes a file handle on Linux. So set open_files_limit to some generous number (say 8192) so the operating system can accommodate your many connections.
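To verify what limit a process actually got, you can ask the shell and inspect /proc; a small sketch (the mysqld process name is an assumption about your setup, so the second command may print nothing):

```shell
# Open-file limit for the current shell:
ulimit -n

# Limit actually applied to a running mysqld, if one exists
# (the process name "mysqld" is an assumption; yours may differ):
pid=$(pidof -s mysqld 2>/dev/null) && grep 'open files' "/proc/$pid/limits" || true
```

The /proc check matters because the server may have been started under a different limit than your login shell's.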
I should note that I have worked with MySQL servers with thousands of open connections - it's fine. But most of the time, the vast majority of these connections just sit idle, doing nothing.
To sum up, I would size it for normal application needs plus some headroom for spike events.
Best Answer
It is the number of connections. It is not the number of users, as one user can have many connections concurrently and each one counts against max_connections. And you can also have many more users than connections, as long as users are not permanently connected.
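You can see this distinction in practice by counting live connections per user account; it is common for a single pooled application user to hold dozens of connections at once (information_schema.PROCESSLIST is available in MySQL 5.1 and later):

```sql
SELECT user, COUNT(*) AS connections
FROM information_schema.PROCESSLIST
GROUP BY user
ORDER BY connections DESC;
```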