Robots.Txt.Parser
Overview
Parse robots.txt and sitemaps using .NET. Supports the proposed RFC 9309 standard, as well as the following common, non-standard directives (illustrated in the example below):
- Sitemap
- Host
- Crawl-delay
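For illustration, a robots.txt file using these directives might look like the following (a hypothetical sample, not taken from any real site):
User-agent: *
Disallow: /private/
Crawl-delay: 10
Host: www.example.com
Sitemap: https://www.example.com/sitemap.xml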
Design Considerations
This library is built around HttpClient, making it familiar, easy to use, and adaptable to your needs. Since you have full control over the HttpClient, you can configure custom message handlers to intercept outgoing requests and responses. For example, you may want to add custom headers to a request, configure additional logging, or set up a retry policy.
Some websites have very large sitemaps. For this reason, async streaming is supported as the preferred way of parsing sitemaps.
The library can also be extended to support protocols other than HTTP, such as FTP.
Features
Name | Supported | Priority |
---|---|---|
HTTP/HTTPS | ✔️ | |
FTP/FTPS | ❌ | 0.1 |
Wildcard (*) User-agent | ✔️ | |
Allow & disallow rules | ✔️ | |
End-of-match ($) and wildcard (*) paths | ✔️ | |
Sitemap entries | ✔️ | |
Host directive | ✔️ | |
Crawl-delay directive | ✔️ | |
RSS 2.0 feeds | ❌ | 0.8 |
Atom 0.3/1.0 feeds | ❌ | 0.8 |
Sitemaps XML format | ✔️ | |
Simple text sitemaps | ✔️ | |
Async streaming of sitemaps | ✔️ | |
Cancellation token support | ✔️ | |
Memory management | ✔️ | |
Usage
Install the package via NuGet.
dotnet add package Robots.Txt.Parser
Minimal Example
First, create an implementation of IWebsiteMetadata for the host address that you wish to use.
public class GitHubWebsite : IWebsiteMetadata
{
    public static Uri BaseAddress => new("https://www.github.com");
}
Next, create an instance of RobotWebClient<TWebsite>.
With Dependency Injection
public void ConfigureServices(IServiceCollection services)
{
    services.AddHttpClient<IRobotWebClient<GitHubWebsite>, RobotWebClient<GitHubWebsite>>();
}
Without Dependency Injection
using var httpClient = new HttpClient();
var robotWebClient = new RobotWebClient<GitHubWebsite>(httpClient);
Web Crawler Example
Optionally, specify message handlers to modify the HTTP pipeline. For example, if you are crawling a website, you may want to throttle your requests so that you crawl responsibly. You can achieve this by adding a custom HttpMessageHandler to the pipeline.
public class ResponsibleCrawlerHttpClientHandler : DelegatingHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        var response = await base.SendAsync(request, cancellationToken);
        // pause after each request to throttle the crawl rate
        await Task.Delay(TimeSpan.FromSeconds(1), cancellationToken);
        return response;
    }
}
With Dependency Injection
public void ConfigureServices(IServiceCollection services)
{
    services.TryAddTransient<ResponsibleCrawlerHttpClientHandler>();
    services.AddHttpClient<IRobotWebClient<GitHubWebsite>, RobotWebClient<GitHubWebsite>>()
            .AddHttpMessageHandler<ResponsibleCrawlerHttpClientHandler>();
}
Without Dependency Injection
var handler = new ResponsibleCrawlerHttpClientHandler
{
    InnerHandler = new HttpClientHandler()
};
using var httpClient = new HttpClient(handler);
var robotWebClient = new RobotWebClient<GitHubWebsite>(httpClient);
Retrieving the Sitemap
var robotsTxt = await robotWebClient.LoadRobotsTxtAsync();
// providing a datetime only retrieves sitemap items modified since this datetime
var modifiedSince = new DateTime(2023, 01, 01);
// sitemaps are iterated asynchronously
// even if robots.txt does not contain a Sitemap directive, a sitemap is still looked for at {TWebsite.BaseAddress}/sitemap.xml
await foreach (var item in robotsTxt.LoadSitemapAsync(modifiedSince))
{
    // process each sitemap item here
}
Checking a Rule
var robotsTxt = await robotWebClient.LoadRobotsTxtAsync();
// if rules for the specific robot are not present, it falls back to the wildcard *
var anyRulesDefined = robotsTxt.TryGetRules(ProductToken.Parse("SomeBot"), out var rules);
// even if no wildcard rules exist, an empty rule-checker is returned
var isAllowed = rules.IsAllowed("/some/path");
Getting Preferred Host
var robotsTxt = await robotWebClient.LoadRobotsTxtAsync();
// host value will fall back to TWebsite.BaseAddress host, if no directive exists
var hasHostDirective = robotsTxt.TryGetHost(out var host);
Getting Crawl Delay
var robotsTxt = await robotWebClient.LoadRobotsTxtAsync();
// if rules for the specific robot are not present, it falls back to the wildcard *
// if no Crawl-delay directive exists, crawl delay will be 0
var hasCrawlDelayDirective = robotsTxt.TryGetCrawlDelay(ProductToken.Parse("SomeBot"), out var crawlDelay);
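Putting It Together
The calls above can be combined into a simple, polite crawl loop. The sketch below is illustrative only: it assumes the sitemap item exposes its URL via a Location property (the actual member name may differ) and that the returned crawl delay is a number of seconds.
var robotsTxt = await robotWebClient.LoadRobotsTxtAsync();
robotsTxt.TryGetRules(ProductToken.Parse("SomeBot"), out var rules);
robotsTxt.TryGetCrawlDelay(ProductToken.Parse("SomeBot"), out var crawlDelay);

await foreach (var item in robotsTxt.LoadSitemapAsync(new DateTime(2023, 01, 01)))
{
    // item.Location is an assumed member name for the sitemap item's URL
    if (!rules.IsAllowed(item.Location.AbsolutePath)) continue;

    // ...fetch and process the page here...

    // respect the Crawl-delay directive between requests (0 when no directive exists)
    await Task.Delay(TimeSpan.FromSeconds(crawlDelay));
}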
Contributing
Issues and pull requests are encouraged. For large or breaking changes, please open an issue first so the change can be discussed before you proceed.
If you find this project useful, please give it a star.