2/22/2010

PuTTY Connection Manager

Looking for a tabbed version of the famous PuTTY?

PuTTY Connection Manager is a free PuTTY add-on for Windows whose goal is to provide a way to manage multiple PuTTY instances. Tabbed sessions are one of the most important features missing from PuTTY (see this and this user complaint)...

http://puttycm.free.fr/cms/

Postfix spam control

Postfix 2.x

Open /etc/postfix/main.cf and add the following lines, replacing any of these settings that already exist. Note that the continuation lines of smtpd_recipient_restrictions must begin with whitespace, otherwise Postfix treats each of them as a new parameter:

vi /etc/postfix/main.cf

[...]
smtpd_helo_required = yes
disable_vrfy_command = yes
strict_rfc821_envelopes = yes
invalid_hostname_reject_code = 554
multi_recipient_bounce_reject_code = 554
non_fqdn_reject_code = 554
relay_domains_reject_code = 554
unknown_address_reject_code = 554
unknown_client_reject_code = 554
unknown_hostname_reject_code = 554
unknown_local_recipient_reject_code = 554
unknown_relay_recipient_reject_code = 554
unknown_sender_reject_code = 554
unknown_virtual_alias_reject_code = 554
unknown_virtual_mailbox_reject_code = 554
unverified_recipient_reject_code = 554
unverified_sender_reject_code = 554

smtpd_recipient_restrictions =
        reject_invalid_hostname,
        reject_unknown_recipient_domain,
        reject_unauth_pipelining,
        permit_mynetworks,
        permit_sasl_authenticated,
        reject_unauth_destination,
        reject_rbl_client multi.uribl.com,
        reject_rbl_client dsn.rfc-ignorant.org,
        reject_rbl_client dul.dnsbl.sorbs.net,
        reject_rbl_client list.dsbl.org,
        reject_rbl_client sbl-xbl.spamhaus.org,
        reject_rbl_client bl.spamcop.net,
        reject_rbl_client dnsbl.sorbs.net,
        reject_rbl_client cbl.abuseat.org,
        reject_rbl_client ix.dnsbl.manitu.net,
        reject_rbl_client combined.rbl.msrbl.net,
        reject_rbl_client rabl.nuclearelephant.com,
        permit
[...]

Restart Postfix afterwards:

/etc/init.d/postfix restart
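Before restarting it can be worth sanity-checking the edited settings. The helper below is a hypothetical sketch (check_reject_codes is not a Postfix tool): it scans a main.cf-style file and flags any *_reject_code value outside the SMTP permanent-failure range 500-599. On a live system you would point it at /etc/postfix/main.cf, or simply review the active configuration with postconf -n.

```shell
# Hypothetical sanity check: flag *_reject_code values outside 500-599.
check_reject_codes() {
  awk -F'=' '/_reject_code/ {
    code = $2 + 0                    # numeric value of the right-hand side
    if (code < 500 || code > 599) { print "bad code:" $0; bad = 1 }
  } END { exit bad }' "$1"
}

# Demo against a sample fragment (use /etc/postfix/main.cf on a real system):
cat > /tmp/main.cf.sample <<'EOF'
invalid_hostname_reject_code = 554
unknown_client_reject_code = 554
EOF
check_reject_codes /tmp/main.cf.sample && echo "reject codes OK"
```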

2/02/2010

Search Engine Optimization Toolkit

The IIS Search Engine Optimization (SEO) Toolkit helps Web developers, hosting providers, and Web server administrators to improve their Web site’s relevance in search results by recommending how to make the site content more search engine-friendly. The IIS SEO Toolkit includes the Site Analysis module, the Robots Exclusion module, and the Sitemaps and Site Indexes module, which let you perform detailed analysis and offer recommendations and editing tools for managing your Robots and Sitemaps files.
Improve the volume and quality of traffic to your Web site from search engines

The Site Analysis module allows users to analyze local and external Web sites with the purpose of optimizing the site's content, structure, and URLs for search engine crawlers. In addition, the Site Analysis module can be used to discover common problems in the site content that negatively affect the site visitor experience. The Site Analysis tool includes a large set of pre-built reports to analyze the site's compliance with SEO recommendations and to discover problems on the site, such as broken links, duplicate resources, or performance issues. The Site Analysis module also supports building custom queries against the data gathered during crawling.
Control how search engines access and display Web content

The Robots Exclusion module enables Web site owners to manage the robots.txt file from within the IIS Manager interface. This file controls the indexing of specified URLs by disallowing search engine crawlers from accessing them. Users can view their site in a physical or a logical hierarchical view, and from within that view disallow specific files or folders of the Web application. In addition, users can manually enter a path or modify a selected path, including wildcards. The graphical interface gives users a clear picture of which sections of the Web site are disallowed and helps them avoid typing mistakes.
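For reference, the robots.txt file the module reads and writes is plain text in this form (the paths and host below are made-up examples, not output of the toolkit):

```text
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /

Sitemap: http://www.example.com/sitemap.xml
```

Each Disallow or Allow line covers a URL path prefix, and the optional Sitemap line is the same registration the toolkit's "register a sitemap into the robots exclusion file" feature performs.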
Inform search engines about locations that are available for indexing

The Sitemaps and Site Indexes module enables Web site owners to manage the sitemap files and sitemap indexes on the site, application, and folder level to help keep search engines up to date. The Sitemaps and Site Indexes module allows the most important URLs to be listed and ranked in the sitemap.xml file. In addition, the Sitemaps and Site Indexes module helps to ensure the Sitemap.xml file does not contain any broken links.
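As a point of reference, a minimal sitemap.xml of the kind the module manages looks like this (the URLs and values are placeholder examples following the sitemaps.org schema):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-02-02</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/products/</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only loc is required per URL; priority is the ranking knob mentioned above, and broken loc entries are what the module's link check catches.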
Site Analysis Features

* Fully featured crawler engine
o Configurable number of concurrent requests (from 1 to 16) so users can crawl their Web site without putting excessive load on it.
o Support for Robots.txt, allowing you to customize the locations where the crawler should analyze and which locations should be ignored.
o Support for Sitemap files allowing you to specify additional locations to be analyzed.
o Support for overriding ‘noindex’ and ‘nofollow’ metatags, so you can analyze pages that search engines will not process and still improve the visitor experience on them.
o Configurable limits for analysis, maximum number of URLs to download, and maximum number of kilobytes to download per URL.
o Configurable options for including content from only your directories or the entire site and sub domains.
* View detailed summary of Web site analysis results through a rich dashboard
* Feature rich Query Builder interface that allows you to build custom reports
* Quick access to common tasks
* Display of detailed information for each URL
* View detailed route analysis showing unique routes to better understand the way search engines reach your content

Robots Exclusion Features

* Display of robots content in a friendly user interface
* Support for filtering, grouping, and sorting
* Ability to add ‘disallow’ and ‘allow’ paths using a logical view of your Web site from the result of site analysis processing
* Ability to add sitemap locations

Sitemap and Sitemap Index Features

* Display of sitemaps and sitemap index files in a simple user interface
* Support for grouping and sorting
* Ability to add/edit/remove sitemap and sitemap index files
* Ability to add new URLs to sitemap and sitemap index files using a physical or logical view of your Web site
* Ability to register a sitemap or sitemap index into the robots exclusion file
http://www.iis.net/expand/SEOToolkit