Multiple connections that arrive within roughly the same 500 millisecond window can be considered concurrent connections.
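To make that definition concrete, here is a minimal sketch that counts the peak number of concurrent connections in a list of request arrival times, treating everything inside a sliding 500 ms window as concurrent. The timestamps are hypothetical arrival times in seconds, just for illustration.

```python
WINDOW = 0.5  # 500 ms window from the definition above

def peak_concurrent(timestamps, window=WINDOW):
    """Return the largest number of connections that fall into
    any sliding window of `window` seconds."""
    times = sorted(timestamps)
    peak = 0
    start = 0
    for end, t in enumerate(times):
        # Shrink the window from the left until it spans at most `window` seconds.
        while t - times[start] > window:
            start += 1
        peak = max(peak, end - start + 1)
    return peak

# Hypothetical arrival times (seconds): four requests land within 0.0-0.4 s.
arrivals = [0.00, 0.10, 0.35, 0.40, 1.20, 1.25, 1.60, 2.90]
print(peak_concurrent(arrivals))  # 4
```

So even though eight requests arrive over almost three seconds, the server only ever sees four of them "at the same time" under this definition.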
So when looking at new server hardware, you have to think hard about whether you will actually hit a certain concurrent connection limit at some point.
REAL LIFE EXAMPLE
Let's say you publish a campaign in a magazine, promoting a special offer through a single-page website.
To estimate the possible concurrent connections, you would need to know roughly how many readers the magazine has.
The chance that all of them read the magazine and visit your site at the same time is practically zero.
So let's say the magazine has 50,000 readers and roughly 1% of them hit your website at the same time.
In that case you would need a server setup that can handle 500 concurrent users.
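The estimate above is simple enough to write down as a back-of-the-envelope calculation. The numbers (50,000 readers, 1% peak overlap) are the assumptions from the example:

```python
readers = 50_000        # magazine readership (assumption from the example)
peak_fraction = 0.01    # ~1% visiting at the same time (assumption)

concurrent_users = int(readers * peak_fraction)
print(concurrent_users)  # 500
```

Of course the 1% figure is the weak point of any such estimate; it is worth recalculating with a pessimistic value (say 5%) to see how much headroom your setup really needs.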
These considerations matter even more when building mobile apps. With a popular app you can easily hit those concurrent user limits. This is where cloud solutions become really handy, because they scale with your traffic demands. A good example is Parse.
In practice, hitting a concurrent connection limit is often less critical than hitting a RAM limit :)