uu.decode()

uu.decode(in_file, out_file=None, mode=None, quiet=False) This call decodes the uuencoded file in_file, placing the result in the file out_file. If out_file is a pathname, mode is used to set the permission bits if the file must be created. Defaults for out_file and mode are taken from the uuencode header. However, if the file specified in the header already exists, a uu.Error is raised. decode() may print a warning to standard error if the input was produced by an incorrect uuencoder and Python could recover from that error; passing a true value for quiet suppresses this warning.
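A minimal round-trip sketch, with hypothetical filenames (payload.bin, payload.uu, copy.bin); uu.encode(), described below, produces the input. The uu module was deprecated in Python 3.11 and removed in 3.13, so this assumes an older interpreter:

    import uu

    with open("payload.bin", "wb") as f:
        f.write(b"hello uu\n")

    uu.encode("payload.bin", "payload.uu")

    # Passing out_file explicitly sidesteps the uu.Error raised when the
    # filename recorded in the uuencode header already exists on disk.
    uu.decode("payload.uu", "copy.bin", quiet=True)

    with open("copy.bin", "rb") as f:
        assert f.read() == b"hello uu\n"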

uu.encode()

uu.encode(in_file, out_file, name=None, mode=None) Uuencode file in_file into file out_file. The uuencoded file will have a header specifying name and mode as the defaults for the results of decoding the file. If name and mode are not supplied, their defaults are taken from in_file, or '-' and 0o666 respectively.
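A short sketch showing name and mode overriding the defaults taken from in_file; the filenames are made up:

    import uu

    with open("data.bin", "wb") as f:
        f.write(b"payload")

    # Write the header as "begin 644 renamed.bin" instead of inheriting
    # the name and permission bits of data.bin.
    uu.encode("data.bin", "data.uu", name="renamed.bin", mode=0o644)

    with open("data.uu") as f:
        print(f.readline().rstrip())  # begin 644 renamed.bin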

uuid.getnode()

uuid.getnode() Get the hardware address as a 48-bit positive integer. The first time this runs, it may launch a separate program, which could be quite slow. If all attempts to obtain the hardware address fail, a random 48-bit number is chosen with the multicast bit (the least significant bit of the first octet) set to 1, as recommended in RFC 4122. “Hardware address” means the MAC address of a network interface; on a machine with multiple network interfaces, the MAC address of any one of them may be returned.
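A quick sketch of inspecting the result. The bit-40 test is an inference from the RFC 4122 fallback described above, not an API guarantee:

    import uuid

    node = uuid.getnode()
    print(f"{node:012x}")  # 48 bits rendered as 12 hex digits, MAC-style

    # Bit 40 is the least significant bit of the first octet; if it is set,
    # the value is likely a randomly generated fallback, not a real MAC.
    print("random fallback?", bool(node & (1 << 40)))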

urllib.robotparser.RobotFileParser.modified()

modified() Sets the time the robots.txt file was last fetched to the current time.

urllib.robotparser.RobotFileParser.set_url()

set_url(url) Sets the URL referring to a robots.txt file.

urllib.robotparser.RobotFileParser.mtime()

mtime() Returns the time the robots.txt file was last fetched. This is useful for long-running web spiders that need to check for new robots.txt files periodically.
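A sketch of the periodic-refetch pattern this enables in a long-running spider; the one-hour threshold and the URL are arbitrary choices:

    import time
    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser("https://example.com/robots.txt")
    rp.read()  # parsing also stamps the fetch time, so mtime() is now set

    MAX_AGE = 3600  # refetch robots.txt after an hour (arbitrary)

    def allowed(url, useragent="*"):
        # Re-read the file once our copy is older than MAX_AGE.
        if time.time() - rp.mtime() > MAX_AGE:
            rp.read()
        return rp.can_fetch(useragent, url)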

urllib.robotparser.RobotFileParser.read()

read() Reads the robots.txt URL and feeds it to the parser.

urllib.robotparser.RobotFileParser

class urllib.robotparser.RobotFileParser(url='') This class provides methods to read, parse and answer questions about the robots.txt file at url.
set_url(url) Sets the URL referring to a robots.txt file.
read() Reads the robots.txt URL and feeds it to the parser.
parse(lines) Parses the lines argument.
can_fetch(useragent, url) Returns True if the useragent is allowed to fetch the url according to the rules contained in the parsed robots.txt file.
mtime() Returns the time the robots.txt file was last fetched.
modified() Sets the time the robots.txt file was last fetched to the current time.
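A minimal end-to-end sketch tying the methods together; the URL and user agent string are illustrative:

    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # fetch and parse in one step

    print(rp.can_fetch("MyCrawler/1.0", "https://example.com/private/"))
    print(rp.mtime())  # seconds since the epoch at the time of the fetch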

urllib.robotparser.RobotFileParser.can_fetch()

can_fetch(useragent, url) Returns True if the useragent is allowed to fetch the url according to the rules contained in the parsed robots.txt file.

urllib.robotparser.RobotFileParser.parse()

parse(lines) Parses the lines argument.
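parse() is handy when the robots.txt text was obtained some other way, for instance from a cache. A small sketch with made-up rules:

    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.parse([
        "User-agent: *",
        "Disallow: /private/",
    ])

    print(rp.can_fetch("*", "https://example.com/private/page"))  # False
    print(rp.can_fetch("*", "https://example.com/public/page"))   # True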