Shared memory is not always a protected resource, so many users can allocate it. It is also not automatically returned to the memory pool when the process that allocated it dies. This can leave shared memory segments that are allocated but no longer used, which amounts to a memory leak that may not be obvious.
By keeping shared memory limits low, most processes which use shared memory (in small amounts) can still run, while the potential damage is limited. The only systems I have used which require large amounts of shared memory are database servers. These are usually administered by system administrators who are aware of the requirements; if not, the DBA is usually aware of them and can ask for appropriate configuration changes. The database installation instructions usually specify how to calculate and set the appropriate limits.
I have had databases die and leave large amounts of shared memory allocated but unused. This created problems for users of the system and prevented the database from being restarted. Fortunately, there were tools (such as ipcs and ipcrm for SysV segments) which allowed the memory to be located and released.
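The leak described above can be reproduced in miniature with Python's standard library. This is a minimal sketch using POSIX shared memory via `multiprocessing.shared_memory` (not the SysV segments that ipcs/ipcrm manage, but the lifetime behaviour is the same): closing a handle does not free the segment, only an explicit unlink does.

```python
from multiprocessing import shared_memory

# Create a 4 KiB shared memory segment. The OS keeps it alive until it
# is explicitly unlinked, even after the creating handle is closed.
seg = shared_memory.SharedMemory(create=True, size=4096)
name = seg.name
seg.buf[:5] = b"hello"
seg.close()  # closes our handle; the segment itself still exists

# Another process (or this one) can still attach to it by name.
leaked = shared_memory.SharedMemory(name=name)
assert bytes(leaked.buf[:5]) == b"hello"

# Cleanup must be explicit -- this is the step a crashed process skips.
leaked.close()
leaked.unlink()  # returns the memory to the system
```

If the process dies between `create` and `unlink`, the segment lingers and must be found and removed by hand, which is exactly the situation the cleanup tools address.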
Most browsers can warn you when an HTTPS page includes HTTP content. This can be very annoying if you visit sites that mix HTTP content into their HTTPS pages, and from your question it appears Wikipedia is one of those. With the appropriate setting enabled, Firefox warns me when visiting this page.
A web server is not required to offer HTTPS. Many sites do not offer it, and others may only use it to secure login screens and other content that they deem requires a secure path. Even if you use HTTPS, it is still possible to determine which servers you are browsing. In many cases the server only hosts one site, so the site would be known as well.
Until recently, the certificates required for HTTPS were quite expensive. Depending on the level of trust required, the cost can still be high. Banks and other organizations which require a high degree of trust and security will pay high prices for their certificates.
If you wish to hide your traffic from local monitoring, you could use a secure path to a proxy. This may raise red flags with whoever is monitoring your traffic.
If you use a private proxy, anyone monitoring traffic downstream of the proxy would still be able to determine much of the information you are trying to hide.
A couple of reasons:
The BCrypt-based scheme isn't NIST approved.
Hash functions are designed for this kind of usage, whereas Blowfish is a block cipher and wasn't.
The added security in BCrypt is based on it being computationally expensive, rather than on the type of algorithm. Relying on computationally expensive operations isn't good for long-term security.
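To illustrate what "computationally expensive" means here: bcrypt takes a cost parameter, and the work grows as 2 to that cost, so raising it by one doubles the effort for attacker and defender alike. BCrypt itself is not in Python's standard library, so this sketch demonstrates the same tunable work-factor idea with stdlib PBKDF2; it is an analogy, not bcrypt's actual key schedule.

```python
import hashlib
import time

def slow_hash(password: bytes, salt: bytes, cost: int) -> bytes:
    # PBKDF2 stands in for bcrypt here purely to show the idea of a
    # cost parameter: the iteration count is 2**cost, so each +1 to
    # the cost doubles the time needed to test one password guess.
    return hashlib.pbkdf2_hmac("sha256", password, salt, 2 ** cost)

pw, salt = b"hunter2", b"fixed-salt-for-demo"
for cost in (10, 14, 18):
    start = time.perf_counter()
    slow_hash(pw, salt, cost)
    elapsed = time.perf_counter() - start
    print(f"cost={cost:2d}  iterations={2 ** cost:7d}  time={elapsed:.4f}s")
```

The catch, and the point of the answer above, is that this kind of security erodes as hardware gets faster: the cost parameter must be raised over time just to stand still.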
See http://en.wikipedia.org/wiki/Crypt_%28Unix%29 for some discussion on this.