urllib.robotparser.RobotFileParser.read()

read()

Reads the robots.txt URL and feeds it to the parser.
Links:
https://docs.python.org/3.5/library/urllib.robotparser.html#urllib.robotparser.RobotFileParser.read
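In normal use, read() fetches robots.txt over the network from the URL given to set_url() and hands the lines to the parser. The sketch below stays offline by feeding equivalent lines to parse() directly; the example.com URL and the robots.txt rules are illustrative assumptions, not part of the original.

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")

# read() would fetch the file at the URL above; here we pass the
# same kind of lines to parse() so the example runs without a network.
robots_lines = [
    "User-agent: *",
    "Disallow: /private/",
]
rp.parse(robots_lines)

print(rp.can_fetch("*", "https://example.com/index.html"))  # True
print(rp.can_fetch("*", "https://example.com/private/x"))   # False
```

After read() (or parse()) has run, can_fetch() answers queries against the loaded rules.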
doc_python
2016-10-07 17:47:06
Popular Articles

urllib.robotparser.RobotFileParser
class urllib.robotparser.RobotFileParser(url='') This class provides methods to read, parse and answer questions about the robots.txt file at url.

urllib.robotparser.RobotFileParser.modified()
modified() Sets the time the robots.txt file was last fetched to the current time.

urllib.robotparser.RobotFileParser.can_fetch()
can_fetch(useragent, url) Returns True if the useragent is allowed to fetch the url according to the rules contained in the parsed robots.txt file.

urllib.robotparser.RobotFileParser.parse()
parse(lines) Parses the lines argument.

urllib.robotparser.RobotFileParser.mtime()
mtime() Returns the time the robots.txt file was last fetched. This is useful for long-running web spiders that need to check for new robots.txt files periodically.
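The modified()/mtime() pair above supports the long-running-spider pattern: record when robots.txt was last fetched, then re-fetch once the cached copy is stale. A minimal sketch, assuming a hypothetical one-hour refresh interval and an illustrative example.com URL (parse() stands in for the network fetch so the example runs offline):

```python
import time
from urllib.robotparser import RobotFileParser

REFRESH_SECONDS = 3600  # hypothetical refresh interval, not from the docs

rp = RobotFileParser("https://example.com/robots.txt")
rp.parse(["User-agent: *", "Disallow: /tmp/"])
rp.modified()  # record the fetch time; read() does this style of bookkeeping in real use

def maybe_refresh(rp: RobotFileParser) -> None:
    # Re-fetch robots.txt only when the cached copy is older than the interval.
    if time.time() - rp.mtime() > REFRESH_SECONDS:
        rp.read()       # network fetch in real use
        rp.modified()   # reset the last-fetched timestamp
```

Because modified() was just called, mtime() returns a recent timestamp and maybe_refresh() does nothing until the interval elapses.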