Saturday, March 7, 2009

Speed Up Your Visual Studio Website Building & Publishing

The Problem

Recently, I had been experiencing a very annoying problem with publishing my project. In my company it is common practice to publish the website we are going to deploy to a local folder with the “Single File Assembly” option ON, and then upload the files that have changed via FTP to the staging or production server. Lately, whenever I tried to publish the website, Visual Studio would fail during pre-compilation with output like this:

“Object <436114d4_6fca_4720_9ef7_fcd21e349d9d>/<Z3qAWE53tdTYmJNo8I3Wn6ys_320>.rem has been disconnected or does not exist at the server.”

(The alphanumeric parts are random, of course, or so it seems.)
I googled the error and found an interesting explanation of what was causing it. During pre-compilation a remoting object is created, and because the publish takes so long to execute, the object gets garbage collected in the meantime and becomes unavailable to the process. So the real culprit here was the project build time. The project I was working on is pretty large (~6.66 GB), and I never dared to press F5 while testing something, because if I somehow did, I had to go for a coffee, and sometimes, even after a coffee break of 15-20 minutes, I would come back only to find the build still going on…

The Search for a Solution

Finally, I got frustrated and searched desperately for a fix. One solution I could think of was increasing my machine’s RAM, but that required the organization’s approval. My machine has 1 GB of RAM and runs Windows XP Service Pack 3, which I thought was plenty for XP and Visual Studio 2005. Anyway, I opened the System Properties box by right-clicking “My Computer” and clicking “Properties” in the context menu, went to the “Performance” section, and selected the option that gives performance the highest priority. Moreover, I killed some unnecessary processes from Windows Task Manager, especially the “Search Indexer”. Now my machine was faster than lightning! So I started publishing again.

Output:

“Object <some alphanumeric>/<some_alphanumeric_number>.rem has been disconnected or does not exist at the server.”

Duh! So much for all that effort. I started searching again, and this time God said “Let there be light…” and, of course, “there was light!”

Solution

After a long search I found the culprit behind the “long build time” mischief: it was some conflicting dll.refresh files.

Whenever someone adds a reference to an assembly in a web application, a dll.refresh file gets generated automatically in the bin folder. This .refresh file stores the relative path of the original dll file from which the reference was made. At build time, Visual Studio checks those dll.refresh files to find out whether any updates have been made to the original dll file; if it finds an update, it automatically refreshes the referenced copy accordingly. If there are dependent assemblies, Visual Studio copies them to the bin folder too.

So, to stop Visual Studio from checking for updates on every build, the dll.refresh files can be deleted, which in turn solved my long build time problem (see the sketch below for a quick way to do it). Believe me, the change in build time was drastic.
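
For illustration (the file names here are hypothetical), the bin folder pairs each referenced assembly with its marker file, and the .refresh file is just a small text file holding the source path:

bin\ThirdParty.dll
bin\ThirdParty.dll.refresh      (contains, e.g., ..\Libs\ThirdParty.dll)

To delete all of them at once, a command like this can be run from a command prompt in the website’s root folder (after taking a backup):

del /S /Q *.dll.refresh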

This, I think, is a good solution to the slowdown problem because, even if the original dll gets updated, you can always add the reference again to bring your application up to date. Generally, within short-term projects we do not change third-party dlls very often, so their refresh files can be deleted without any hesitation. Of course, if you have referenced a class library of your own that changes every now and then, it is better to keep its refresh file intact. So, what was the problem, and where was the conflict?

Scott Guthrie calls this the “Dueling Assembly Reference Problem”. You can find a wonderfully detailed explanation on his blog.

Friday, March 6, 2009

How to Make Your Website McAfee Secure

Recently, after a critical attack on the server where an application was running, one of my clients asked me to make sure that the application was secure from every perspective. We chose McAfee’s scanning service to certify the website. After a thorough Application Scan, Network Scan, and PCI Scan, McAfee found several vulnerabilities in the website, and I believe some of them exist in most of the websites we build. So here is a brief list of those vulnerabilities, and the solutions we found, for making a website McAfee Secure and PCI compliant.

Before we delve into the technical details, I want to clarify what this article covers and what it does not, so as not to create any false expectations. This article is based solely on my experience in fixing these vulnerabilities, and I have presented them accordingly. Several of the fixes require registry tweaks, so make sure to back up the registry first, and change anything at your own risk.

Following are the most important vulnerabilities that were reported by McAfee:

1  Vulnerability 1

1.1  Vulnerability Name:  Microsoft .NET Custom Errors Not Set
1.2  Description:

This is flagged when no custom error page is set for an application. We generally use web.config to specify the error page, but McAfee does not recognize that.
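
For reference, the web.config approach that McAfee ignores looks something like this (the page names are illustrative):

<system.web>
  <customErrors mode="On" defaultRedirect="~/Error.aspx">
    <error statusCode="404" redirect="~/NotFound.aspx" />
  </customErrors>
</system.web>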

Solution:

To fix this vulnerability, the error pages must be set from IIS. The steps are as follows:

    • Go to IIS Manager.
    • Right-click the website that has to be McAfee Secure.
    • Click Properties.
    • In the Properties box, click the “Custom Errors” tab; the following window appears:

Fig1

    • Highlight any one of the error types and click “Edit Properties”; a small window appears where the custom error page can be specified.

Fig2

    • Click OK in both this window and its parent.
    • If a custom page is specified for even one error type, McAfee no longer considers this a vulnerability.

2  Vulnerability 2

2.1  Vulnerability Name:  Allow All Policy in crossdomain.xml
2.2  Description:

If a website hosts a number of Flash videos and other content that other websites embed in their own pages, crossdomain.xml comes into play. This XML file sets the policy, in terms of domain names, for who may use the content. A permissive file contains the element:
<allow-access-from domain="*"/>
According to McAfee, this wildcard poses a serious threat of attack on the website.

Solution

It is very confusing how McAfee scans for this file, because even if the file is removed from the root (wwwroot), McAfee still finds and flags any copy that exists anywhere on the server. The best solution is to edit each and every occurrence of the file on the whole server, remove the bare wildcard (*), and list the specific domains from which the content will be accessed. If the file is not required at all, it is best to delete it completely.
So the edited element will look like:
<allow-access-from domain="*.yahoo.com"/>
(Here the * matches anything prepended to the domain name, e.g. messenger.yahoo.com, and covers all protocols, HTTP and HTTPS.)
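
Putting it together, a complete restrictive crossdomain.xml might look like this (the domain names are illustrative placeholders):

<?xml version="1.0"?>
<!DOCTYPE cross-domain-policy SYSTEM "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
<cross-domain-policy>
  <allow-access-from domain="www.example.com" />
  <allow-access-from domain="*.example.com" />
</cross-domain-policy>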

3  Vulnerability 3

3.1  Vulnerability Name:  Using SSL v2 in secure communications
3.2  Description:

The remote service appears to encrypt traffic using SSL protocol version 2.
Netscape Communications introduced SSL 2.0 in the mid-1990s, and it contains several well-known weaknesses. For example, SSLv2 doesn't provide any protection against man-in-the-middle attacks during the handshake, and it uses the same cryptographic keys for message authentication and for encryption.
In Internet Explorer 7, the default HTTPS protocol settings were changed to disable the weaker SSLv2 protocol and to enable the stronger TLSv1 protocol. By default, IE7 users will only negotiate HTTPS connections using SSLv3 or TLSv1. Mozilla Firefox is expected to drop support for SSLv2 in its upcoming versions.
As almost all modern browsers support SSLv3, disabling support for the weaker SSL method should have minimal impact. The following browsers support SSLv3:

    • Internet Explorer 5.5 or higher (PC)
    • Internet Explorer 5.0 or higher (Mac)
    • Netscape 2.0 (Domestic) or higher (PC/Mac)
    • Firefox 0.8 or higher (PC/Mac/Linux)
    • Mozilla 1.7 or higher (PC/Mac/Linux)
    • Camino 0.8 or higher (Mac)
    • Safari 1.0 or higher (Mac)
    • Opera 1.7 or higher (PC/Mac)
    • Omniweb 3.0 or higher (Mac)
    • Konqueror 2.0 or higher (Linux)

According to https://www.pcisecuritystandards.org/pdfs/pcissc_assessors_nl_2008-11.pdf, an assessors' update report, "...it is imperative that an ASV identify the use of SSL 2.0 to transmit cardholder data as a failure."
Solution:

SSL-related protocols can run on other service ports as well; typical ports include 465, 993, 995, 2078, 2083, 2087, 2096, 8443, etc. Each application will have its own configuration options for handling SSL protocols.
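
Before making any changes, it is worth confirming that the server really does accept SSLv2. One quick check (assuming an OpenSSL build old enough to still include SSLv2 support; the host name is a placeholder):

openssl s_client -connect www.example.com:443 -ssl2

If the handshake succeeds and a certificate is printed, SSLv2 is still enabled; an immediate handshake failure means it is already off.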
To solve this problem, one needs to open the Registry Editor:

    • Click Start, click Run, type regedt32 or regedit, and then click OK.
    • In Registry Editor, locate the following registry key:
      HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Server

Fig3

    • On the Edit menu, click Add Value.
    • In the Data Type list, click DWORD.

Fig4

    • In the Value Name box, type Enabled, and then click OK. (The value data will automatically be set to 0, i.e. disabled. A .reg sketch of this change follows below.)

Fig5
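
Equivalently, the whole change can be captured in a .reg file; this is a sketch, and importing it will create the Server subkey if it does not already exist:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Server]
"Enabled"=dword:00000000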

4  Vulnerability 4

4.1  Vulnerability Name:  Weak Supported SSL Cipher Suites

4.2  Description:

The remote host supports the use of SSL ciphers that offer either weak encryption or no encryption at all. This vulnerability applies to all SSL/TLS sessions that carry sensitive information.
PCI defines strong cryptography, for secret-key based systems, as anything above 80-bit encryption.
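
For a quick look at which cipher suites count as weak, OpenSSL can enumerate the low-strength, export-grade, and null ciphers (a sketch; the exact output varies by OpenSSL version):

openssl ciphers -v 'LOW:EXP:eNULL'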

Solution:

The solution to this is very simple, but it requires a registry tweak again. The steps are as follows:

    • Click Start, click Run, type regedt32 or regedit, and then click OK.
    • In Registry Editor, locate the following registry key:
      HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers

Fig6

    • Under the Ciphers key there are several cipher subkeys.
    • Locate the ciphers whose encryption strength is less than 128 bits.
    • For each of them, create a DWORD value named Enabled with data 0, just as in the previous case (see the .reg sketch after this list).
    • For convenience, I have marked them with red arrows in the picture above.
    • A system restart is NOT required for this.
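
As with SSLv2, the same tweak can be captured in a .reg file. The subkey names below are the weak-cipher entries commonly present on Windows Server 2003 / XP, so verify them against what your own Ciphers key actually contains:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers\NULL]
"Enabled"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers\RC2 40/128]
"Enabled"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers\RC4 40/128]
"Enabled"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers\DES 56/56]
"Enabled"=dword:00000000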

Now the server is secured.
The security issues mentioned above are the major ones that most systems have. Besides these, there may be some minor, easily fixed vulnerabilities, such as:

    • Exposing site structure through robots.txt (generally added by the web marketing team for search-engine indexing).
    • Directory scanner: common directories are revealed. This can be resolved by URL rewriting and by setting “Directory Browsing” off.

Note: The SSL fixes above require registry tweaks, so it is strongly recommended to back up the registry before doing anything. If something does get messed up, just delete the SCHANNEL key and restart the machine; the key will be regenerated automatically.