aiflows.utils package¶
Submodules¶
aiflows.utils.general_helpers module¶
- aiflows.utils.general_helpers.create_unique_id(existing_ids: List[str] | None = None)¶
Creates a unique id.
- Parameters:
existing_ids (List[str], optional) – A list of existing ids to check against, defaults to None
- Returns:
A unique id
- Return type:
str
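A minimal usage sketch (the surrounding code is illustrative, not part of the API):

    from aiflows.utils.general_helpers import create_unique_id

    existing_ids = []
    new_id = create_unique_id(existing_ids=existing_ids)  # does not collide with existing_ids
    existing_ids.append(new_id)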
- aiflows.utils.general_helpers.encode_from_buffer(buffer)¶
Encodes a buffer (typically an image from a video) to base64.
- aiflows.utils.general_helpers.encode_image(image_path)¶
Encodes an image to base64.
- aiflows.utils.general_helpers.exception_handler(e)¶
Handles an exception.
- Parameters:
e (Exception) – The exception to handle
- aiflows.utils.general_helpers.extract_top_level_function_names(python_file_path)¶
Extracts the top-level function names from a Python file (nested functions are ignored).
- Parameters:
python_file_path (str) – The path to the python file
- Returns:
A list of function names
- Return type:
List[str]
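An illustrative, self-contained sketch (the throwaway module written below is hypothetical):

    from aiflows.utils.general_helpers import extract_top_level_function_names

    # write a small throwaway module so the example can run on its own
    with open("example_module.py", "w") as f:
        f.write("def top_level():\n    def nested():\n        pass\n    return nested\n")

    print(extract_top_level_function_names("example_module.py"))  # expected: ["top_level"]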
- aiflows.utils.general_helpers.find_replace_in_dict(cfg, key_to_find, new_value, current_path='')¶
Recursively searches a dictionary for keys equal to key_to_find and replaces their values with new_value. Note 1: every matching key is replaced, no matter how deeply it is nested in the dictionary. Note 2: we recommend using this function only in the Quick Start tutorial, not in production code.
- Parameters:
cfg (Dict[str, Any]) – The dictionary to search in
key_to_find (str) – The key to find
new_value (Any) – The new value to set
current_path (str, optional) – The current path, defaults to “”
- Returns:
The updated dictionary
- Return type:
Dict[str, Any]
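An illustrative sketch (the keys and values below are made up for the example):

    from aiflows.utils.general_helpers import find_replace_in_dict

    cfg = {"flow": {"model_name": "gpt-3.5", "subflow": {"model_name": "gpt-3.5"}}}
    cfg = find_replace_in_dict(cfg, key_to_find="model_name", new_value="gpt-4")
    # every occurrence of "model_name", at any nesting depth, now maps to "gpt-4"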
- aiflows.utils.general_helpers.flatten_dict(d, parent_key='', sep='.')¶
Flattens a dictionary.
- Parameters:
d (Dict[str, Any]) – The dictionary to flatten
parent_key (str, optional) – The parent key to use, defaults to ‘’
sep (str, optional) – The separator to use, defaults to ‘.’
- Returns:
The flattened dictionary
- Return type:
Dict[str, Any]
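An illustrative sketch of the expected flattening with the default separator:

    from aiflows.utils.general_helpers import flatten_dict

    nested = {"a": {"b": 1, "c": {"d": 2}}}
    flat = flatten_dict(nested)
    # expected: {"a.b": 1, "a.c.d": 2}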
- aiflows.utils.general_helpers.get_current_datetime_ns()¶
Returns the current datetime in nanoseconds.
- Returns:
The current datetime in nanoseconds
- Return type:
int
- aiflows.utils.general_helpers.get_function_from_name(function_name, module)¶
Returns a function from a module given its name.
- aiflows.utils.general_helpers.get_predictions_dir_path(output_dir, create_if_not_exists=True)¶
Returns the path to the predictions folder.
- Parameters:
output_dir (str) – The output directory
create_if_not_exists (bool, optional) – Whether to create the folder if it does not exist, defaults to True
- Returns:
The path to the predictions folder
- Return type:
str
- aiflows.utils.general_helpers.log_suggest_help()¶
Logs a message suggesting how to get help or provide feedback on GitHub.
- aiflows.utils.general_helpers.nested_keys_pop(data_dict: dict, nested_key: str) Any ¶
Pop a nested key in a dictionary.
- Parameters:
data_dict (dict) – The dictionary to pop from.
nested_key (str) – The nested key to pop, in the format “key1.key2.key3”.
- Returns:
The value of the popped key.
- Return type:
Any
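An illustrative sketch (the dictionary contents are made up):

    from aiflows.utils.general_helpers import nested_keys_pop

    data = {"a": {"b": {"c": 42}}}
    value = nested_keys_pop(data, "a.b.c")
    # value == 42, and the "c" entry has been removed from the nested dictionary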
- aiflows.utils.general_helpers.nested_keys_search(search_dict, nested_key) Tuple[Any, bool] ¶
Searches for a nested key in a dictionary using a composite key string.
- Parameters:
search_dict (dict) – The dictionary to search in.
nested_key (str) – The composite key string to search for.
- Returns:
A tuple containing the value of the nested key and a boolean indicating if the key was found.
- Return type:
Tuple[Any, bool]
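An illustrative sketch (the dictionary contents are made up; the value returned for a missing key is not relied on here):

    from aiflows.utils.general_helpers import nested_keys_search

    data = {"a": {"b": {"c": 42}}}
    value, found = nested_keys_search(data, "a.b.c")    # value == 42, found is True
    _, found = nested_keys_search(data, "a.b.missing")  # found is False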
- aiflows.utils.general_helpers.nested_keys_update(data_dict: dict, nested_key: str, value: Any) None ¶
Update the value of a nested key in a dictionary.
- Parameters:
data_dict (dict) – The dictionary to update.
nested_key (str) – The nested key to update, in the format “key1.key2.key3”.
value (Any) – The new value to set for the nested key.
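An illustrative sketch (the dictionary contents are made up):

    from aiflows.utils.general_helpers import nested_keys_update

    data = {"a": {"b": {"c": 1}}}
    nested_keys_update(data, "a.b.c", 99)
    # data is updated in place: data["a"]["b"]["c"] == 99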
- aiflows.utils.general_helpers.process_config_leafs(config: Dict | List, leaf_processor: Callable[[Tuple[Any, Any]], Any])¶
Processes the leaves of a config dictionary or list.
- Parameters:
config (Union[Dict, List]) – The config to process
leaf_processor (Callable[[Tuple[Any, Any]], Any]) – The leaf processor to use
- aiflows.utils.general_helpers.python_file_path_to_module_path(file_path)¶
Converts a Python file path to a Python module path.
- Parameters:
file_path (str) – The python file path
- Returns:
The python module path
- Return type:
str
- aiflows.utils.general_helpers.python_module_path_to_file_path(module_path)¶
Converts a Python module path to a Python file path.
- Parameters:
module_path (str) – The python module path
- Returns:
The python file path
- Return type:
str
- aiflows.utils.general_helpers.read_gzipped_jsonlines(path_to_file)¶
Reads a gzipped jsonlines file and returns a list of dictionaries.
- Parameters:
path_to_file (str) – The path to the gzipped jsonlines file
- Returns:
A list of dictionaries
- Return type:
List[Dict[str, Any]]
- aiflows.utils.general_helpers.read_jsonlines(path_to_file)¶
Reads a jsonlines file and returns a list of dictionaries.
- Parameters:
path_to_file (str) – The path to the jsonlines file
- Returns:
A list of dictionaries
- Return type:
List[Dict[str, Any]]
- aiflows.utils.general_helpers.read_outputs(outputs_dir)¶
Reads the outputs from a jsonlines file.
- Parameters:
outputs_dir (str) – The directory containing the output files
- Returns:
The outputs
- Return type:
List[Dict[str, Any]]
- aiflows.utils.general_helpers.read_yaml_file(path_to_file, resolve=True)¶
Reads a yaml file.
- Parameters:
path_to_file (str) – The path to the yaml file
resolve (bool, optional) – Whether to resolve the config, defaults to True
- Returns:
The config
- Return type:
Dict[str, Any]
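A self-contained sketch; the YAML file written below is hypothetical, and resolve=True is simply passed through as documented above:

    from aiflows.utils.general_helpers import read_yaml_file

    # write a tiny YAML file so the example can run on its own
    with open("demo_config.yaml", "w") as f:
        f.write("flow:\n  name: demo\n  temperature: 0.3\n")

    config = read_yaml_file("demo_config.yaml", resolve=True)
    # expected: {"flow": {"name": "demo", "temperature": 0.3}}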
- aiflows.utils.general_helpers.recursive_dictionary_update(d, u)¶
Performs a recursive update of the values in dictionary d with the values of dictionary u.
- Parameters:
d (Dict[str, Any]) – The dictionary to update
u (Dict[str, Any]) – The dictionary to update with
- Returns:
The updated dictionary
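An illustrative sketch of the recursive merge (the contents are made up):

    from aiflows.utils.general_helpers import recursive_dictionary_update

    defaults = {"model": {"name": "base", "temperature": 0.7}, "verbose": False}
    overrides = {"model": {"temperature": 0.2}}
    merged = recursive_dictionary_update(defaults, overrides)
    # nested keys are merged rather than replaced wholesale:
    # expected: merged["model"] == {"name": "base", "temperature": 0.2}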
- aiflows.utils.general_helpers.try_except_decorator(f)¶
A decorator that wraps the passed-in function in order to handle exceptions and log a message suggesting how to get help or provide feedback on GitHub.
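An illustrative sketch of decorating a function (the function name and body are placeholders):

    from aiflows.utils.general_helpers import try_except_decorator

    @try_except_decorator
    def run_experiment():
        ...  # an exception raised in here is handled by the decorator, which logs the GitHub help/feedback message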
- aiflows.utils.general_helpers.unflatten_dict(d, sep='.')¶
Unflattens a dictionary.
- Parameters:
d (Dict[str, Any]) – The dictionary to unflatten
sep (str, optional) – The separator to use, defaults to ‘.’
- Returns:
The unflattened dictionary
- Return type:
Dict[str, Any]
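An illustrative sketch; with the default separator, unflatten_dict is expected to invert flatten_dict:

    from aiflows.utils.general_helpers import flatten_dict, unflatten_dict

    flat = {"a.b": 1, "a.c.d": 2}
    nested = unflatten_dict(flat)
    # expected: {"a": {"b": 1, "c": {"d": 2}}}
    # and flatten_dict(nested) is expected to give back the original flat dictionary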
- aiflows.utils.general_helpers.validate_flow_config(cls, flow_config)¶
Validates the flow config.
- Parameters:
cls (class) – The class to validate the flow config for
flow_config (Dict[str, Any]) – The flow config to validate
- Raises:
ValueError – If the flow config is invalid
- aiflows.utils.general_helpers.write_gzipped_jsonlines(path_to_file, data, mode='w')¶
Writes a list of dictionaries to a gzipped jsonlines file.
- Parameters:
path_to_file (str) – The path to the gzipped jsonlines file
data (List[Dict[str, Any]]) – The data to write
mode (str, optional) – The mode to use, defaults to “w”
- aiflows.utils.general_helpers.write_jsonlines(path_to_file, data, mode='w')¶
Writes a list of dictionaries to a jsonlines file.
- Parameters:
path_to_file (str) – The path to the jsonlines file
data (List[Dict[str, Any]]) – The data to write
mode (str, optional) – The mode to use, defaults to “w”
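A self-contained round-trip sketch using write_jsonlines together with read_jsonlines (the file name is hypothetical):

    from aiflows.utils.general_helpers import write_jsonlines, read_jsonlines

    records = [{"id": 1, "answer": "foo"}, {"id": 2, "answer": "bar"}]
    write_jsonlines("predictions.jsonl", records)  # mode defaults to "w"
    loaded = read_jsonlines("predictions.jsonl")
    # loaded is expected to equal records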
- aiflows.utils.general_helpers.write_outputs(path_to_output_file, summary, mode)¶
Writes the summary to a jsonlines file.
- Parameters:
path_to_output_file (str) – The path to the output file
summary (List[Dict[str, Any]]) – The summary to write
mode (str) – The mode to use
aiflows.utils.io_utils module¶
- aiflows.utils.io_utils.load_pickle(pickle_path: str)¶
Loads data from a pickle file.
- Parameters:
pickle_path (str) – The path to the pickle file
- Returns:
The data loaded from the pickle file
- Return type:
Any
- aiflows.utils.io_utils.recursive_json_serialize(obj)¶
Recursively serializes an object to json.
- Parameters:
obj (Any) – The object to serialize
- Returns:
The serialized object
- Return type:
Any
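An illustrative sketch; the object below is made up, and json.dumps is only used to show that the result is JSON-serializable:

    import json
    from aiflows.utils.io_utils import recursive_json_serialize

    obj = {"values": [1, 2, 3], "meta": {"tag": "run-1"}}
    serializable = recursive_json_serialize(obj)
    print(json.dumps(serializable))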
aiflows.utils.logging module¶
Logging utilities.
- aiflows.utils.logging.add_handler(handler: Handler) None ¶
Adds a handler to the Flows's root logger.
- aiflows.utils.logging.auto_set_dir(action=None, name=None)¶
Use logger.set_logger_dir() to set the log directory to "./.aiflows/logs/{scriptname}:{name}", where "scriptname" is the name of the main Python file currently running.
- Parameters:
action (str, optional) – an action in ["k", "d", "q"] to perform when the directory already exists; by default the user is asked. "d": delete the directory (note that deletion may fail while the directory is in use by tensorboard). "k": keep the directory, which is useful when you resume from a previous training and want the directory to look as if the training was not interrupted; note that this option does not load old models or any other old state, it simply does nothing.
name (str, optional) – The name of the directory
- aiflows.utils.logging.disable_default_handler() None ¶
Disable the default handler of the Flows’s root logger.
- aiflows.utils.logging.disable_propagation() None ¶
Disable propagation of the library log outputs. Note that log propagation is disabled by default.
- aiflows.utils.logging.enable_default_handler() None ¶
Enable the default handler of the Flows’s root logger.
- aiflows.utils.logging.enable_explicit_format() None ¶
Enable explicit formatting for every Flows’s logger. The explicit formatter is as follows:
[LEVELNAME|FILENAME|LINE NUMBER] TIME >> MESSAGE
All handlers currently bound to the root logger are affected by this method.
- aiflows.utils.logging.enable_propagation() None ¶
Enable propagation of the library log outputs. Please disable the Flows’s default handler to prevent double logging if the root logger has been configured.
- aiflows.utils.logging.get_log_levels_dict()¶
Return a dictionary of all available log levels.
- aiflows.utils.logging.get_logger(name: str | None = None) Logger ¶
Return a logger with the specified name. This function is not meant to be accessed directly unless you are writing a custom aiflows module.
- Parameters:
name (str, optional) – The name of the logger to return
- Returns:
The logger
- aiflows.utils.logging.get_logger_dir()¶
- Returns:
The logger directory, or None if not set. The directory is used for general logging, tensorboard events, checkpoints, etc.
- Return type:
str
- aiflows.utils.logging.get_verbosity() int ¶
Return the current level for the Flows’s root logger as an int.
- Returns:
The logging level
- Return type:
int
Note
Flows has the following logging levels:
50: aiflows.logging.CRITICAL or aiflows.logging.FATAL
40: aiflows.logging.ERROR
30: aiflows.logging.WARNING or aiflows.logging.WARN
20: aiflows.logging.INFO
10: aiflows.logging.DEBUG
- aiflows.utils.logging.remove_handler(handler: Handler) None ¶
Removes the given handler from the Flows's root logger.
- aiflows.utils.logging.reset_format() None ¶
Resets the formatting for Flows’s loggers. All handlers currently bound to the root logger are affected by this method.
- aiflows.utils.logging.set_dir(dirname, action=None)¶
Set the directory for global logging.
- Parameters:
dirname (str) – The log directory
action (str, optional) – an action in ["k", "d", "q"] to perform when the directory already exists; by default the user is asked. "d": delete the directory (note that deletion may fail while the directory is in use by tensorboard). "k": keep the directory, which is useful when you resume from a previous training and want the directory to look as if the training was not interrupted; note that this option does not load old models or any other old state, it simply does nothing.
- aiflows.utils.logging.set_verbosity(verbosity: int) None ¶
Set the verbosity level for the Flows’s root logger.
- Parameters:
verbosity (int) – Logging level. For example, one of:
aiflows.logging.CRITICAL or aiflows.logging.FATAL
aiflows.logging.ERROR
aiflows.logging.WARNING or aiflows.logging.WARN
aiflows.logging.INFO
aiflows.logging.DEBUG
- aiflows.utils.logging.set_verbosity_debug()¶
Set the verbosity to the DEBUG level.
- aiflows.utils.logging.set_verbosity_error()¶
Set the verbosity to the ERROR level.
- aiflows.utils.logging.set_verbosity_info()¶
Set the verbosity to the INFO level.
- aiflows.utils.logging.set_verbosity_warning()¶
Set the verbosity to the WARNING level.
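A small sketch combining the helpers above (the logger name and message are illustrative):

    from aiflows.utils import logging

    logging.set_verbosity_info()  # or set_verbosity_debug(), set_verbosity_warning(), set_verbosity_error()
    logger = logging.get_logger(__name__)
    logger.info("current verbosity: %d", logging.get_verbosity())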
- aiflows.utils.logging.warning_advice(self, *args, **kwargs)¶
This method is identical to logger.warning(), but if the env var FLOWS_NO_ADVISORY_WARNINGS=1 is set, this warning will not be printed.
- Parameters:
self – The logger object
*args – The arguments to pass to the warning method
**kwargs – The keyword arguments to pass to the warning method
- aiflows.utils.logging.warning_once(self, *args, **kwargs)¶
This method is identical to logger.warning(), but it will emit a warning with the same message only once.
Note
The cache is keyed on the function arguments, so two different callers using the same arguments will hit the cache. The assumption here is that all warning messages are unique across the code. If they aren't, we need to switch to another type of cache that includes the caller frame information in the hashing function.
aiflows.utils.rich_utils module¶
- aiflows.utils.rich_utils.print_config_tree(cfg: DictConfig, print_order: Sequence[str] = [], resolve: bool = False, save_to_file: bool = False) None ¶
Prints the content of a DictConfig using the Rich library and its tree structure.
- Parameters:
cfg (DictConfig) – Configuration composed by Hydra.
print_order (Sequence[str], optional) – Determines in what order config components are printed, defaults to []
resolve (bool, optional) – Whether to resolve reference fields of DictConfig, defaults to False
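An illustrative sketch; the config below is built by hand with OmegaConf rather than composed by Hydra, and the print_order entries are made up:

    from omegaconf import OmegaConf
    from aiflows.utils.rich_utils import print_config_tree

    cfg = OmegaConf.create({"flow": {"name": "demo", "temperature": 0.3}, "output_dir": "outputs"})
    print_config_tree(cfg, print_order=["flow"], resolve=True)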