tf.contrib.learn.read_batch_features(file_pattern, batch_size, features, reader, randomize_input=True, num_epochs=None, queue_capacity=10000, feature_queue_capacity=100, reader_num_threads=1, parser_num_threads=1, parse_fn=None, name=None)
Adds operations to read, queue, batch and parse Example protos.
Given a file pattern (or list of files), sets up a queue of file names, reads Example protos using the provided reader, uses a batch queue to create batches of examples of size batch_size, and parses the examples according to the features specification.
All queue runners are added to the queue runners collection and may be started via start_queue_runners.
All ops are added to the default graph.
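For example, a minimal sketch of building the feature map. The file pattern and feature keys below are hypothetical, and TFRecord files of serialized Example protos are assumed:

```python
import tensorflow as tf

# Hypothetical feature spec; keys and file pattern are illustrative only.
features = {
    "age": tf.FixedLenFeature([], tf.int64),
    "gender": tf.FixedLenFeature([], tf.string),
    "tags": tf.VarLenFeature(tf.string),
}

# Builds the file-name queue, reader, batch queue and parsing ops.
feature_map = tf.contrib.learn.read_batch_features(
    file_pattern="/tmp/data/train-*.tfrecord",
    batch_size=128,
    features=features,
    reader=tf.TFRecordReader,
    num_epochs=10)
```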
Args:
- file_pattern: List of files or pattern of file paths containing Example records. See tf.gfile.Glob for pattern rules.
- batch_size: An int or scalar Tensor specifying the batch size to use.
- features: A dict mapping feature keys to FixedLenFeature or VarLenFeature values.
- reader: A function or class that returns an object with a read method, (filename tensor) -> (example tensor).
- randomize_input: Whether the input should be randomized.
- num_epochs: Integer specifying the number of times to read through the dataset. If None, cycles through the dataset forever. NOTE: if specified, creates a variable that must be initialized, so call tf.initialize_local_variables() as shown in the tests (see also the sketch after the Raises section).
- queue_capacity: Capacity for the input queue.
- feature_queue_capacity: Capacity of the parsed features queue. Set this value to a small number, for example 5, if the parsed features are large.
- reader_num_threads: The number of threads to read examples.
- parser_num_threads: The number of threads to parse examples.
- parse_fn: Parsing function, takes an Example Tensor and returns a parsed representation. If None, no parsing is done.
- name: Name of the resulting op.
Returns:
A dict of Tensor or SparseTensor objects for each key in features.
Raises:
- ValueError: for invalid inputs.
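A minimal sketch of consuming the resulting ops in a session, assuming feature_map was created as in the example above. Because num_epochs was specified, local variables must be initialized before starting the queue runners; tf.local_variables_initializer() is the current name for the tf.initialize_local_variables() call mentioned above:

```python
with tf.Session() as sess:
    # num_epochs creates a local variable that must be initialized.
    sess.run(tf.local_variables_initializer())
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)
    try:
        while not coord.should_stop():
            batch = sess.run(feature_map)  # dict of Tensor/SparseTensor values
            # ... consume the batch ...
    except tf.errors.OutOfRangeError:
        pass  # reached num_epochs
    finally:
        coord.request_stop()
        coord.join(threads)
```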