Python API
API for writing service templates
See here for an overview of the TOSCA Python DSL.
TOSCA Field Specifiers
The following functions are used as field specifiers when declaring attributes on a TOSCA type. Use these if you need to specify TOSCA-specific information about the field or if the TOSCA field type can’t be inferred from the Python attribute’s type. For example:
class MyNode(tosca.nodes.Root):
    a_tosca_property: str = Property(name="a-tosca-property", default=None, metadata={"foo": "bar"})
Note that these functions all take keyword-only parameters (this is needed for IDE integration).
- tosca.Artifact(*, default=<dataclasses._MISSING_TYPE object>, factory=<dataclasses._MISSING_TYPE object>, name='', metadata=None, options=None)
Field specifier for declaring a TOSCA artifact.
- Parameters
default (Any, optional) – Default value. Set to None if the artifact isn’t required. Defaults to MISSING.
factory (Callable, optional) – Factory function to initialize the artifact with a unique value per template. Defaults to MISSING.
name (str, optional) – TOSCA name of the field, overrides the artifact’s name when generating YAML. Defaults to “”.
metadata (Dict[str, JSON], optional) – Dictionary of metadata to associate with the artifact.
options (Options, optional) – Additional typed metadata to merge into metadata.
- Return type
Any
- tosca.Attribute(*, default=None, factory=<dataclasses._MISSING_TYPE object>, name='', constraints=None, metadata=None, title='', status='', options=None, init=False)
Field specifier for declaring a TOSCA attribute.
- Parameters
default (Any, optional) – Default value. Set to None if the attribute isn’t required. Defaults to MISSING.
factory (Callable, optional) – Factory function to initialize the attribute with a unique value per template. Defaults to MISSING.
name (str, optional) – TOSCA name of the field, overrides the attribute’s name when generating YAML. Defaults to “”.
metadata (Dict[str, JSON], optional) – Dictionary of metadata to associate with the attribute.
options (Options, optional) – Additional typed metadata to merge into metadata.
constraints (List[DataConstraint], optional) – List of TOSCA property constraints to apply to the attribute.
title (str, optional) – Human-friendly alternative name of the attribute.
status (str, optional) – TOSCA status of the attribute.
init (Literal[False]) –
- Return type
Any
- tosca.Capability(*, default=<dataclasses._MISSING_TYPE object>, factory=<dataclasses._MISSING_TYPE object>, name='', metadata=None, options=None, valid_source_types=None)
Field specifier for declaring a TOSCA capability.
- Parameters
default (Any, optional) – Default value. Set to None if the capability isn’t required. Defaults to MISSING.
factory (Callable, optional) – Factory function to initialize the capability with a unique value per template. Defaults to MISSING.
name (str, optional) – TOSCA name of the field, overrides the capability’s name when generating YAML. Defaults to “”.
metadata (Dict[str, JSON], optional) – Dictionary of metadata to associate with the capability.
options (Options, optional) – Additional typed metadata to merge into metadata.
valid_source_types (List[str], optional) – List of TOSCA type names to set as the capability’s valid_source_types
- Return type
Any
- tosca.Computed(name='', *, factory, metadata=None, title='', status='', options=None, attribute=False)
Field specifier for declaring a TOSCA property whose value is computed by the factory function at runtime.
- Parameters
factory (function) – function called at runtime every time the property is evaluated.
name (str, optional) – TOSCA name of the field, overrides the Python name when generating YAML.
metadata (Dict[str, JSON], optional) – Dictionary of metadata to associate with the property.
title (str, optional) – Human-friendly alternative name of the property.
status (str, optional) – TOSCA status of the property.
options (Options, optional) – Typed metadata to apply.
attribute (bool, optional) – Indicate that the property is also a TOSCA attribute.
- Return type
tosca._tosca.RT – the return type of the factory function (should be compatible with the field type)
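For example, a property whose value is recomputed at runtime might be declared like this (a hypothetical sketch: it assumes unfurl’s `tosca` package is importable and that the factory can be a zero-argument callable; the names are made up):

```python
# Hypothetical sketch, assuming unfurl's `tosca` package is importable
# and that the factory can be a zero-argument callable.
import tosca
from tosca import Computed

def make_greeting() -> str:
    # called every time `greeting` is evaluated at runtime
    return "hello world"

class MyService(tosca.nodes.Root):
    greeting: str = Computed(factory=make_greeting)
```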
- tosca.Property(*, default=<dataclasses._MISSING_TYPE object>, factory=<dataclasses._MISSING_TYPE object>, name='', constraints=None, metadata=None, title='', status='', options=None, attribute=False)
Field specifier for declaring a TOSCA property.
- Parameters
default (Any, optional) – Default value. Set to None if the property isn’t required. Defaults to MISSING.
factory (Callable, optional) – Factory function to initialize the property with a unique value per template. Defaults to MISSING.
name (str, optional) – TOSCA name of the field, overrides the property’s name when generating YAML. Defaults to “”.
metadata (Dict[str, JSON], optional) – Dictionary of metadata to associate with the property.
options (Options, optional) – Additional typed metadata to merge into metadata.
constraints (List[DataConstraint], optional) – List of TOSCA property constraints to apply to the property.
title (str, optional) – Human-friendly alternative name of the property.
status (str, optional) – TOSCA status of the property.
attribute (bool, optional) – Indicate that the property is also a TOSCA attribute. Defaults to False.
- Return type
Any
- tosca.Requirement(*, default=<dataclasses._MISSING_TYPE object>, factory=<dataclasses._MISSING_TYPE object>, name='', metadata=None, options=None, relationship=None, capability=None, node=None, node_filter=None)
Field specifier for declaring a TOSCA requirement.
- Parameters
default (Any, optional) – Default value. Set to None if the requirement isn’t required. Defaults to MISSING.
factory (Callable, optional) – Factory function to initialize the requirement with a unique value per template. Defaults to MISSING.
name (str, optional) – TOSCA name of the field, overrides the requirement’s name when generating YAML. Defaults to “”.
metadata (Dict[str, JSON], optional) – Dictionary of metadata to associate with the requirement.
options (Options, optional) – Additional typed metadata to merge into metadata.
relationship (str | Type[Relationship], optional) – The requirement’s relationship, specified by TOSCA type name or Relationship class.
capability (str | Type[CapabilityType], optional) – The requirement’s capability, specified by TOSCA type name or CapabilityType class.
node (str | Type[Node], optional) – The requirement’s node, specified by TOSCA type name or Node class.
node_filter (Dict[str, Any], optional) – The TOSCA node_filter for this requirement.
- Return type
Any
- tosca.operation(name='', apply_to=None, timeout=None, operation_host=None, environment=None, dependencies=None, outputs=None, entry_state=None, invoke=None)
Function decorator that marks a function or method as a TOSCA operation.
- Parameters
name (str, optional) – Name of the TOSCA operation. Defaults to the name of the method.
apply_to (Sequence[str], optional) – List of TOSCA operations to apply this method to. If omitted, matches by the operation name.
timeout (float, optional) – Timeout for the operation (in seconds). Defaults to None.
operation_host (str, optional) – The name of host where this operation will be executed. Defaults to None.
environment (Dict[str, str], optional) – A map of environment variables to use while executing the operation. Defaults to None.
dependencies (List[Union[str, Dict[str, Any]]], optional) – List of artifacts this operation depends on. Defaults to None.
outputs (Dict[str, str], optional) – TOSCA outputs mapping. Defaults to None.
entry_state (str, optional) – Node state required to invoke this operation. Defaults to None.
invoke (str, optional) – Name of operation to delegate this operation to. Defaults to None.
- Return type
Callable[[Callable], Callable]
This example marks a method as implementing the create and delete operations on the Standard TOSCA interface:

@operation(apply_to=["Standard.create", "Standard.delete"])
def default(self):
    return self.my_artifact.execute()
TOSCA Types
- class tosca.ToscaType(_name='', *, _metadata=<factory>)
Base class for TOSCA type definitions.
- Parameters
_name (str) –
_metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –
- Return type
None
- find_configured_by(field_name: str | FieldProjection)
Transitively search for field_name along the .configured_by axis (see Special keys) and return the first match. For example:

class A(Node):
    pass

class B(Node):
    url: str
    connects_to: A = tosca.Requirement(relationship=unfurl.relationships.Configures)

a = A()
b = B(connects_to=a, url="https://example.com")

>>> a.find_configured_by(B.url)
"https://example.com"
If called during class definition this will return an eval expression. If called as a classmethod or as a free function it will evaluate in the current context.
- Parameters
field_name (str | FieldProjection) – Either the name of the field or, for more type safety, a reference to the field (e.g. B.url in the example above).
- Returns
The value of the referenced field
- Return type
Any
- find_hosted_on(field_name: str | FieldProjection)
Transitively search for field_name along the .hosted_on axis (see Special keys) and return the first match. For example:

class A(Node):
    url: str

class B(Node):
    host: A = tosca.Requirement(relationship=tosca.relationships.HostedOn)

a = A(url="https://example.com")
b = B(host=a)

>>> b.find_hosted_on(A.url)
"https://example.com"
If called during class definition this will return an eval expression. If called as a classmethod or as a free function it will evaluate in the current context.
- Parameters
field_name (str | FieldProjection) – Either the name of the field or, for more type safety, a reference to the field (e.g. A.url in the example above).
- Returns
The value of the referenced field
- Return type
Any
- classmethod set_to_property_source(requirement, property)
Sets the given requirement to the TOSCA template that provided the value of “property”.
For example, if A.property = B.property, then A.set_to_property_source("requirement", "property") will create a node filter for A.requirement that selects B.

The requirement and property have to be defined on the same class. The method should be called from _class_init(cls).
- Parameters
requirement (FieldProjection or str) – name of the requirement field
property (FieldProjection or str) – name of the property field
- Raises
TypeError – If requirement or property are missing from cls.
- Return type
None
The requirement and property names can also be strings, e.g.:
cls.set_to_property_source("requirement", "property")
Note that cls.set_to_property_source(cls.requirement, cls.property) is equivalent to cls.requirement = cls.property if called within _class_init(cls), but using this method avoids static type checker complaints.
- set_operation(op, name=None)
Assign the given TOSCA operation to this TOSCA object. TOSCA allows operations to be defined directly on templates.
- Parameters
op (Callable[[Concatenate[tosca._tosca.ToscaType, ...]], Any]) – A function that implements the operation. It should look like a method, i.e. accept Self as the first argument. Using the tosca.operation function decorator is recommended but not required.
name (Optional[str]) – The TOSCA operation name. If omitted, op’s operation_name or function name is used.
- Return type
None
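For example (a hypothetical sketch assuming unfurl’s `tosca` package; the template name and operation body are made up for illustration):

```python
# Hypothetical sketch, assuming unfurl's `tosca` package; the template
# name and operation body are made up for illustration.
import tosca
from tosca import operation

@operation(name="configure")
def custom_configure(self):
    # a stand-in implementation; `self` is the template the operation runs on
    return None

node = tosca.nodes.Root("my_node")
node.set_operation(custom_configure)  # name taken from the @operation decorator
```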
- class tosca.Node(_name='', *, _metadata=<factory>, _directives=<factory>, _node_filter=None)
A TOSCA node template.
- Parameters
_name (str) –
_metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –
_directives (List[str]) –
_node_filter (Optional[Dict[str, Any]]) –
- Return type
None
- find_required_by(requirement_name: str | FieldProjection, expected_type: Type[Node] | None = None)
Finds the node template with a requirement named requirement_name whose value is this template. For example:

class A(Node):
    pass

class B(Node):
    connects_to: A

a = A()
b = B(connects_to=a)

>>> a.find_required_by(B.connects_to, B)
b
If no match is found, or more than one match is found, an error is raised. If zero or more matches are expected, use find_all_required_by.

If called during class definition this will return an eval expression. If called as a classmethod or as a free function it will evaluate in the current context.
For example, to expand on the example above:
class A(Node):
    parent: B = find_required_by(B.connects_to, B)

Here parent will default to an eval expression.
- Parameters
requirement_name (str | FieldProjection) – Either the name of the requirement or, for more type safety, a reference to the requirement (e.g. B.connects_to in the example above).
expected_type (Node, optional) – The expected type of the node template to be returned. If provided, enables static typing and runtime validation of the return value.
- Returns
The node template that is targeting this template via the requirement.
- Return type
tosca.Node
- find_all_required_by(requirement_name: str | FieldProjection, expected_type: Type[Node] | None = None)
Behaves the same as find_required_by but returns a list of all the matches found. If no match is found, returns an empty list.
- Parameters
requirement_name (str | FieldProjection) – Either the name of the requirement or, for more type safety, a reference to the requirement (e.g. B.connects_to in the example above).
expected_type (Node, optional) – The expected type of the node templates to be returned. If provided, enables static typing and runtime validation of the return value.
- Return type
List[tosca.Node]
- class tosca.Relationship(_name='', *, _metadata=<factory>, _local_name=None, _node=None, _default_for=None, _target=None)
- Parameters
_name (str) –
_metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –
_local_name (Optional[str]) –
_node (Optional[tosca._tosca.Node]) –
_default_for (Optional[str]) –
_target (tosca._tosca.Node) –
- Return type
None
- class tosca.CapabilityEntity(_name='', *, _metadata=<factory>, _local_name=None, _node=None)
- Parameters
_name (str) –
_metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –
_local_name (Optional[str]) –
_node (Optional[tosca._tosca.Node]) –
- Return type
None
- class tosca.DataEntity(_name='', *, _metadata=<factory>, _local_name=None, _node=None)
- Parameters
_name (str) –
_metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –
_local_name (Optional[str]) –
_node (Optional[tosca._tosca.Node]) –
- Return type
None
- class tosca.ArtifactEntity(_name='', *, _metadata=<factory>, _local_name=None, _node=None, file, repository=None, deploy_path=None, version=None, checksum=None, checksum_algorithm=None, permissions=None, intent=None, target=None, order=None, contents=None)
- Parameters
_name (str) –
_metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –
_local_name (Optional[str]) –
_node (Optional[tosca._tosca.Node]) –
file (str) –
repository (Optional[str]) –
deploy_path (Optional[str]) –
version (Optional[str]) –
checksum (Optional[str]) –
checksum_algorithm (Optional[str]) –
permissions (Optional[str]) –
intent (Optional[str]) –
target (Optional[str]) –
order (Optional[int]) –
contents (Optional[str]) –
- Return type
None
- class tosca.Interface(_name='', *, _metadata=<factory>)
- Parameters
_name (str) –
_metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –
- Return type
None
- class tosca.Group(_name='', *, _metadata=<factory>)
- Parameters
_name (str) –
_metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –
- Return type
None
- class tosca.Policy(_name='', *, _metadata=<factory>)
- Parameters
_name (str) –
_metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –
- Return type
None
Other
- tosca.Eval(value)
Use this function to specify that a value is or contains a TOSCA function or eval expression (for example, a property default value).
- Parameters
value (Any) –
- Return type
Any
- class tosca.DataConstraint(constraint)
Base class for TOSCA property constraints. A subclass exists for each of those constraints.
These can be passed to the Property and Attribute field specifiers or used as Python type annotations.
- Parameters
constraint (tosca._tosca.T) –
- class tosca.NodeTemplateDirective(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)
Node Template directives.
- check = 'check'
Run check operation before deploying
- default = 'default'
Only use this template if one with the same name isn’t already defined in the root topology.
- dependent = 'dependent'
Exclude from plan generation
- discover = 'discover'
Discover (instead of create)
- protected = 'protected'
Don’t delete.
- select = 'select'
Match with instance in external ensemble
- substitute = 'substitute'
Create a nested topology
- virtual = 'virtual'
Don’t instantiate
- class tosca.TopologyInputs(_name='', *, _metadata=<factory>)
Base class for defining topology template inputs.
- Parameters
_name (str) –
_metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –
- Return type
None
- class tosca.TopologyOutputs(_name='', *, _metadata=<factory>)
Base class for defining topology template outputs.
- Parameters
_name (str) –
_metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –
- Return type
None
- tosca.set_evaluation_mode(mode)
A context manager that sets the global (per-thread) tosca evaluation mode and restores the previous mode upon exit. This is only needed for testing or other special contexts.
- Parameters
mode (str) – “spec”, “yaml”, or “runtime”
- Yields
the previous mode
with set_evaluation_mode("spec"):
    assert tosca.global_state.mode == "spec"
- tosca.safe_mode()
This function returns True if running within the Python safe mode sandbox.
- Return type
bool
- tosca.global_state_mode()
This function returns the execution state (either “spec” or “runtime”) that the current thread is in.
- Return type
str
- tosca.global_state_context()
This function returns orchestrator-specific runtime state for the current thread (or None).
- Return type
Any
Utility Functions
This module contains utility functions that can be executed in “spec” mode (e.g. as part of a class definition or in _class_init_) and in the safe-mode Python sandbox.
Each of these is also available as an Eval Expression Function.
- unfurl.tosca_plugins.functions.to_dns_label(arg: Union[str, list], *, allowed="'[a-zA-Z0-9-]'", start="'[a-zA-Z]'", replace="'--'", case="'lower'", end="'[a-zA-Z0-9]'", max='63', **kw: Unpack[DNSLabelKwArgs]) str
- unfurl.tosca_plugins.functions.to_dns_label(arg: Mapping, *, allowed="'[a-zA-Z0-9-]'", start="'[a-zA-Z]'", replace="'--'", case="'lower'", end="'[a-zA-Z0-9]'", max='63', **kw: Unpack[DNSLabelKwArgs]) dict
Convert the given argument (see unfurl.tosca_plugins.functions.to_label() for a full description) to a DNS label (a label is one of the dot-separated parts of a domain name). The maximum length of each label is 63 characters; a label may include alphanumeric characters and hyphens but must not begin or end with a hyphen. Invalid characters are replaced with “--”.
- unfurl.tosca_plugins.functions.to_googlecloud_label(arg: Union[str, list], *, allowed="'\\\\w-'", case="'lower'", replace="'__'", start="'[a-zA-Z]'", max='63', **kw: Unpack[DNSLabelKwArgs]) str
- unfurl.tosca_plugins.functions.to_googlecloud_label(arg: Mapping, *, allowed="'\\\\w-'", case="'lower'", replace="'__'", start="'[a-zA-Z]'", max='63', **kw: Unpack[DNSLabelKwArgs]) dict
See https://cloud.google.com/resource-manager/docs/labels-overview
Invalid characters are replaced with “__”.
- unfurl.tosca_plugins.functions.to_kubernetes_label(arg: Union[str, list], *, allowed="'\\\\w.-'", case="'any'", replace="'__'", start="'[a-zA-Z0-9]'", end="'[a-zA-Z0-9]'", max='63', **kw: Unpack[DNSLabelKwArgs]) str
- unfurl.tosca_plugins.functions.to_kubernetes_label(arg: Mapping, *, allowed="'\\\\w.-'", case="'any'", replace="'__'", start="'[a-zA-Z0-9]'", end="'[a-zA-Z0-9]'", max='63', **kw: Unpack[DNSLabelKwArgs]) dict
See https://kubernetes.io/docs/concepts/overview/working-with-objects/labels/#syntax-and-character-set
Invalid characters are replaced with “__”.
- unfurl.tosca_plugins.functions.to_label(arg: Union[str, list], **kw: Unpack[LabelKwArgs]) str
- unfurl.tosca_plugins.functions.to_label(arg: Mapping, **kw: Unpack[LabelKwArgs]) dict
- Convert a string to a label with the given constraints.
If given a dictionary, all keys and string values are converted. If given a list, to_label is applied to each item and the results are concatenated using sep.
- Parameters
arg (str or dict or list) – Convert to label
allowed (str, optional) – Allowed characters. Regex character ranges and character classes. Defaults to “\w” (equivalent to [a-zA-Z0-9_]).
replace (str, optional) – Replacement string for invalid characters. Defaults to “” (remove the characters).
start (str, optional) – Allowed characters for the first character. Regex character ranges and character classes. Defaults to “a-zA-Z”
start_prepend (str, optional) – If the start character is invalid, prepend with this string (Default: “x”)
end (str, optional) – Allowed trailing characters. Regex character ranges and character classes. If set, invalid characters are stripped.
max (int, optional) – max length of label. Defaults to 63 (the maximum for a DNS name).
case (str, optional) – “upper”, “lower” or “any” (no conversion). Defaults to “any”.
sep (str, optional) – Separator to use when concatenating a list. Defaults to “”
digestlen (int, optional) – If a label is truncated, the length of the digest to include in the label. 0 to disable. Default: 3 or 2 if max < 32
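A minimal pure-Python sketch of the transformation these parameters describe (an illustration of the rules above, not unfurl’s implementation; it covers only allowed, replace, start, start_prepend, case, and max, and the name to_label_sketch is hypothetical):

```python
import re

def to_label_sketch(s, *, allowed=r"\w", replace="", start=r"a-zA-Z",
                    start_prepend="x", case="any", max_len=63):
    # Illustrative subset of the documented rules (not unfurl's code);
    # `end`, `sep`, and `digestlen` are omitted for brevity.
    if case == "lower":
        s = s.lower()
    elif case == "upper":
        s = s.upper()
    s = re.sub(f"[^{allowed}]", replace, s)     # replace invalid characters
    if s and not re.match(f"[{start}]", s[0]):  # fix an invalid first character
        s = start_prepend + s
    return s[:max_len]  # the real function appends a digest when truncating

to_label_sketch("My App!", case="lower")  # → "myapp"
to_label_sketch("1abc")                   # → "x1abc"
```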
- unfurl.tosca_plugins.functions.urljoin(scheme, host, port=None, path=None, query=None, frag=None)
Evaluate a list of URL components to a relative or absolute URL, where the list is [scheme, host, port, path, query, fragment].

The list must have at least the first two items (scheme and host) present, but if either or both are empty a relative or scheme-relative URL is generated. If all items are empty, null is returned. The path, query, and fragment items are URL-escaped if present. Default ports (80 and 443 for http and https URLs respectively) are omitted even if specified.
- Parameters
scheme (str) –
host (str) –
port (Optional[Union[str, int]]) –
path (Optional[str]) –
query (Optional[str]) –
frag (Optional[str]) –
- Return type
Optional[str]
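A pure-Python sketch of the behavior described above (an illustration only, not unfurl’s implementation; urljoin_sketch is a hypothetical name):

```python
from urllib.parse import quote

def urljoin_sketch(scheme, host, port=None, path=None, query=None, frag=None):
    # Illustrative reimplementation of the documented behavior.
    parts = [scheme, host, port, path, query, frag]
    if not any(p not in (None, "") for p in parts):
        return None  # all components empty
    default_port = {"http": "80", "https": "443"}.get(scheme)
    url = f"{scheme}:" if scheme else ""  # empty scheme => scheme-relative
    if host:
        url += f"//{host}"
        if port is not None and str(port) != default_port:
            url += f":{port}"  # default ports are omitted
    if path:
        url += quote(path)
    if query:
        url += "?" + quote(query, safe="=&")
    if frag:
        url += "#" + quote(frag)
    return url

urljoin_sketch("https", "example.com", 443, "/a b")  # → "https://example.com/a%20b"
```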
Eval Expression Functions
Type-safe equivalents to Unfurl’s Eval Expression Functions.

When called in “spec” mode (e.g. as part of a class definition or in _class_init_) they return an eval expression that will be executed later. Note that the declared type signature matches the result of the expression, not the eval expression itself (this type punning enables effective static type checking).

When called in runtime mode (i.e. in a computed property or an operation implementation) they perform the equivalent functionality directly.

These functions can be executed in the safe-mode Python sandbox since it always executes in “spec” mode.

Note that some functions are overloaded with two signatures: one that takes a live ToscaType object as an argument and one that takes None in its place. The former variant can only be used in runtime mode, as live objects are not available outside that mode. In “spec” mode the None variant must be used, and at runtime the eval expression returned by that function will be evaluated using the current context’s instance.
- unfurl.tosca_plugins.expr.abspath(obj: ToscaType, path: str, relativeTo=None, mkdir=False) FilePath
- unfurl.tosca_plugins.expr.abspath(obj: None, path: str, relativeTo=None, mkdir=False) str
- unfurl.tosca_plugins.expr.and_expr(left, right)
- Parameters
left (unfurl.tosca_plugins.expr.T) –
right (unfurl.tosca_plugins.expr.U) –
- Return type
Union[unfurl.tosca_plugins.expr.T, unfurl.tosca_plugins.expr.U]
- unfurl.tosca_plugins.expr.as_bool(val)
- Return type
bool
- unfurl.tosca_plugins.expr.concat(*args, sep='')
- Parameters
args (str) –
- Return type
str
- unfurl.tosca_plugins.expr.get_dir(obj: ToscaType, relativeTo=None, mkdir=False) FilePath
- unfurl.tosca_plugins.expr.get_dir(obj: None, relativeTo=None, mkdir=False) str
- unfurl.tosca_plugins.expr.get_ensemble_metadata(key: None = None) Dict[str, str]
- unfurl.tosca_plugins.expr.get_ensemble_metadata(key: str) str
- unfurl.tosca_plugins.expr.get_env(name: str, default: str, *, ctx='None') str
- unfurl.tosca_plugins.expr.get_env(name: str, *, ctx='None') Optional[str]
- unfurl.tosca_plugins.expr.get_env(*, ctx='None') Dict[str, str]
- unfurl.tosca_plugins.expr.get_input(name: str, default: TI) TI
- unfurl.tosca_plugins.expr.get_input(name: str) Any
- unfurl.tosca_plugins.expr.get_nodes_of_type(cls)
- Parameters
cls (Type[tosca._tosca.ToscaType]) –
- Return type
list
- unfurl.tosca_plugins.expr.has_env(name)
- Parameters
name (str) –
- Return type
bool
- unfurl.tosca_plugins.expr.if_expr(if_cond, then, otherwise=None)
Returns an eval expression like: {"eval": {"if": if_cond, "then": then, "else": otherwise}}

This cannot be evaluated in runtime mode because all arguments are evaluated before this function is called, defeating the short-circuit semantics of eval expressions (and of Python). To avoid unexpected behavior, an error is raised if it is invoked during runtime mode; just use a Python if statement or expression instead.
- Parameters
then (unfurl.tosca_plugins.expr.T) –
otherwise (unfurl.tosca_plugins.expr.U) –
- Return type
Union[unfurl.tosca_plugins.expr.T, unfurl.tosca_plugins.expr.U]
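In “spec” mode the call simply builds the deferred expression dictionary; a sketch of that shape (if_expr_sketch is a hypothetical stand-in, not the library function):

```python
def if_expr_sketch(if_cond, then, otherwise=None):
    # Build the deferred expression dict; the orchestrator evaluates the
    # branches lazily, preserving short-circuit semantics.
    return {"eval": {"if": if_cond, "then": then, "else": otherwise}}

if_expr_sketch({"get_env": "DEBUG"}, "verbose", "quiet")
```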
- unfurl.tosca_plugins.expr.lookup(name, *args, **kwargs)
- Parameters
name (str) –
- unfurl.tosca_plugins.expr.negate(val)
- Return type
bool
- unfurl.tosca_plugins.expr.or_expr(left, right)
- Parameters
left (unfurl.tosca_plugins.expr.T) –
right (unfurl.tosca_plugins.expr.U) –
- Return type
Union[unfurl.tosca_plugins.expr.T, unfurl.tosca_plugins.expr.U]
- unfurl.tosca_plugins.expr.tempfile(contents, suffix='', encoding=None)
- Parameters
contents (Any) –
- unfurl.tosca_plugins.expr.template(obj, *, path='', contents='', overrides=None)
- Parameters
obj (Optional[tosca._tosca.ToscaType]) –
path (str) –
contents (str) –
overrides (Optional[Dict[str, str]]) –
- Return type
Any
- unfurl.tosca_plugins.expr.to_env(args, update_os_environ=False)
- Parameters
args (Dict[str, str]) –
- Return type
Dict[str, str]
- unfurl.tosca_plugins.expr.token(string, token, index)
- Parameters
string (str) –
token (str) –
index (int) –
- Return type
str
- unfurl.tosca_plugins.expr.uri(obj=None)
- Parameters
obj (Optional[tosca._tosca.ToscaType]) –
- Return type
Optional[str]
API for writing configurators
- class unfurl.configurator.Configurator(*args, **kw)
Base class for implementing configurators. Subclasses should at least implement run(), for example:

class MinimalConfigurator(Configurator):
    def run(self, task):
        assert self.can_run(task)
        return Status.ok
- Parameters
args (tosca._tosca.ToscaInputs) –
- Return type
None
- attribute_output_metadata_key: Optional[str] = None
- can_dry_run(task)
Returns whether this configurator can handle a dry run for the given task. (If so, it should check TaskView.dry_run during run().)
- Parameters
task (TaskView) – The task about to be run.
- Returns
bool
- Return type
bool
- can_run(task)
Return whether the configurator can execute the given task, i.e. whether this configurator supports the requested action and parameters given the current state of the target instance.
- Parameters
task (TaskView) – The task that is about to be run.
- Returns
Should return True or a message describing why the task couldn’t be run.
- Return type
(bool or str)
- check_digest(task, changeset)
Examine the ChangeRecord generated the last time this operation was performed on the target instance and return whether it should be rerun or not.

The default implementation recalculates the digest of the input parameters that were accessed in the previous run.
- Parameters
task (TaskView) – The task that might execute this operation.
changeset (ChangeRecord) – The ChangeRecord generated the last time this operation was performed.
- Returns
True if configuration’s digest has changed, False if it is the same.
- Return type
bool
- exclude_from_digest: Tuple[str, ...] = ()
- classmethod get_dry_run(inputs, template)
- Parameters
template (unfurl.spec.EntitySpec) –
- Return type
bool
- get_generator(task)
- Parameters
task (unfurl.configurator.TaskView) –
- Return type
Generator
- render(task)
This method is called during the planning phase to give the configurator an opportunity to do early validation and error detection, and to generate any plan information or configuration files that the user may want to review before running the deployment task.
Property reads and writes will be tracked and used to establish dynamic dependencies between instances so the plan can be ordered properly. Any updates made to instances may be reverted if they have dependencies on attributes that might be changed later in the plan, so this method should be idempotent.
- Returns
The value returned here will subsequently be available as task.rendered.
- Parameters
task (unfurl.configurator.TaskView) –
- Return type
Any
- run(task)
Subclasses of Configurator need to implement this method. It should perform the operation specified in the ConfigurationSpec on task.target. It can be either a generator that yields one or more JobRequest or TaskRequest objects, or a regular function.
- Parameters
task (TaskView) – The task currently running.
- Yields
Optionally, run can yield a JobRequest or TaskRequest to run subtasks, and finally a ConfiguratorResult when done.
- Returns
If run is not defined as a generator it must return either a Status, a bool, or a ConfiguratorResult to indicate whether the task succeeded and any changes to the target’s state.
- Return type
Union[Generator, unfurl.configurator.ConfiguratorResult, unfurl.support.Status, bool]
- save_digest(task)
Generate a compact, deterministic representation of the current configuration. This is saved in the job log and used by check_digest in subsequent jobs to determine if the configuration has changed and the operation needs to be re-run.

The default implementation calculates a SHA1 digest of the values of the inputs that were accessed while the task was run, with the exception of the input parameters listed in exclude_from_digest.
- Parameters
task (
TaskView
) –- Returns
A dictionary whose keys are strings that start with “digest”
- Return type
dict
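The default behavior can be approximated like this (the key name "digestValue" and the use of JSON serialization are assumptions made for illustration; only the SHA1-over-accessed-inputs idea comes from the description above):

```python
import hashlib
import json

def make_digest(inputs, exclude_from_digest=()):
    # Deterministic: drop excluded inputs, then serialize canonically
    # (sorted keys) before hashing.
    filtered = {k: v for k, v in inputs.items() if k not in exclude_from_digest}
    blob = json.dumps(filtered, sort_keys=True).encode()
    return {"digestValue": hashlib.sha1(blob).hexdigest()}

d1 = make_digest({"image": "nginx:1.25", "replicas": 3}, exclude_from_digest=("replicas",))
d2 = make_digest({"image": "nginx:1.25", "replicas": 5}, exclude_from_digest=("replicas",))
assert d1 == d2  # excluded inputs don't affect the digest
assert all(k.startswith("digest") for k in d1)
```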
- classmethod set_config_spec_args(kw, template)
- Parameters
kw (dict) –
template (unfurl.spec.EntitySpec) –
- Return type
dict
- short_name: Optional[str] = None
shortName can be used to customize the “short name” of the configurator as an alternative to using the full name (“module.class”) when setting the implementation on an operation. (Titlecase recommended)
- should_run(task)
Does this configuration need to be run?
- Parameters
task (unfurl.configurator.TaskView) –
- Return type
Union[bool, unfurl.support.Priority]
- class unfurl.configurator.ConfiguratorResult(success, modified, status=None, result=None, outputs=None, exception=None)
Represents the result of a task that ran.
See TaskView.done() for more documentation.
- Parameters
success (bool) –
modified (Optional[bool]) –
status (Optional[unfurl.support.Status]) –
result (Optional[Union[dict, str]]) –
outputs (Optional[dict]) –
exception (Optional[unfurl.util.UnfurlTaskError]) –
- Return type
None
- class unfurl.configurator.JobRequest(resources, errors=None, update=None)
Yield this to run a child job.
- Parameters
resources (List[unfurl.runtime.EntityInstance]) –
errors (Optional[Sequence[unfurl.util.UnfurlError]]) –
- get_instance_specs()
- property name
- property root
- set_error(msg)
- Parameters
msg (str) –
- property target
- class unfurl.configurator.TaskRequest(configSpec, target, reason, persist=False, required=None, startState=None)
Yield this to run a child task (see unfurl.configurator.TaskView.create_sub_task()).
- Parameters
configSpec (unfurl.planrequests.ConfigurationSpec) –
target (unfurl.runtime.EntityInstance) –
reason (str) –
persist (bool) –
required (Optional[bool]) –
startState (Optional[unfurl.support.NodeState]) –
- property completed: bool
- finish_workflow()
- Return type
None
- get_operation_artifacts()
- Return type
- property name
- reassign_final_for_workflow()
- Return type
Optional[unfurl.planrequests.TaskRequest]
- class unfurl.configurator.TaskView(manifest, configSpec, target, reason=None, dependencies=None)
The interface presented to configurators.
The following public attributes are available:
- Parameters
manifest (Manifest) –
configSpec (unfurl.planrequests.ConfigurationSpec) –
target (unfurl.runtime.EntityInstance) –
reason (Optional[str]) –
dependencies (Optional[List[Operational]]) –
- Return type
None
- target
The instance this task is operating on.
- cwd
Current working directory
- Type
str
- dry_run
Dry run only
- Type
bool
- verbose
Verbosity level set for this job (-1 error, 0 normal, 1 verbose, 2 debug)
- Type
int
- add_dependency(expr, expected=None, schema=None, name=None, required=True, wantList=False, target=None, write_only=None)
- Parameters
expr (Union[str, collections.abc.Mapping]) –
expected (Optional[Union[list, unfurl.result.ResultsList, unfurl.result.Result]]) –
schema (Optional[collections.abc.Mapping]) –
name (Optional[str]) –
required (bool) –
wantList (bool) –
target (Optional[unfurl.runtime.EntityInstance]) –
write_only (Optional[bool]) –
- Return type
unfurl.configurator.Dependency
- add_message(message)
- Parameters
message (object) –
- Return type
None
- apply_work_folders(*names)
- Parameters
names (str) –
- Return type
None
- property connections: unfurl.configurator._ConnectionsMap
- create_sub_task(operation=None, resource=None, inputs=None, persist=False, required=None)
Create a subtask that will be executed if yielded by unfurl.configurator.Configurator.run().
- Parameters
operation (str) – The operation call (like interface.operation)
resource (NodeInstance) –
inputs (Optional[dict]) –
persist (bool) –
required (Optional[bool]) –
- Returns
- Return type
Optional[unfurl.planrequests.TaskRequest]
- discard_work_folders()
- Return type
None
- done(success=None, modified=None, status=None, result=None, outputs=None, captureException=None)
unfurl.configurator.Configurator.run() should call this method and return or yield its return value before terminating.
>>> yield task.done(True)
- Parameters
success (bool) – indicates if this operation completed without an error.
modified (bool) – (optional) indicates whether the physical instance was modified by this operation.
status (Status) – (optional) should be set if the operation changed the operational status of the target instance. If not specified, the runtime will update the instance status as needed, based on the operation performed and observed changes to the instance (attributes changed).
result (dict) – (optional) A dictionary that will be serialized as YAML into the changelog; it can contain any useful data about this operation.
outputs (dict) – (optional) Operation outputs, as specified in the topology template.
captureException (Optional[object]) –
- Returns
- Return type
- property environ: Dict[str, str]
- fail_work_folders()
- Return type
None
- static find_connection(ctx, target, relation='tosca.relationships.ConnectsTo')
Find a relationship that this task can use to connect to the given instance. First look for a relationship between the task’s target instance and the given instance. If none is found, see if there is a default connection of the given type.
- Parameters
target (NodeInstance) – The instance to connect to.
relation (str, optional) – The relationship type. Defaults to tosca.relationships.ConnectsTo.
ctx (unfurl.eval.RefContext) –
- Returns
The connection instance.
- Return type
RelationshipInstance or None
- find_instance(name)
- Parameters
name (str) –
- Return type
Optional[unfurl.runtime.NodeInstance]
- get_environment(addOnly, env=None)
Return a dictionary of environment variables applicable to this task.
- Parameters
addOnly (bool) – If addOnly is False, all variables in the current os environment will be included; otherwise only added variables will be included.
env (Optional[dict]) –
- Returns
dict:
- Return type
dict
Variable sources (by order of preference, lowest to highest):
1. The ensemble’s environment
2. Variables set by the connections that are available to this operation
3. Variables declared in the operation’s environment section
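The precedence order above can be illustrated as a layered merge (the variable names and values here are invented):

```python
# Lowest to highest priority, per the list above.
ensemble_env = {"REGION": "us-east-1", "DEBUG": "0"}   # 1. ensemble's environment
connection_env = {"DEBUG": "1"}                        # 2. connection variables
operation_env = {"REGION": "eu-west-1"}                # 3. operation's environment section

env = {}
for layer in (ensemble_env, connection_env, operation_env):
    env.update(layer)  # later (higher-priority) layers override earlier ones

assert env == {"REGION": "eu-west-1", "DEBUG": "1"}
```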
- get_settings()
- Return type
dict
- get_work_folder(location=None)
- Parameters
location (Optional[str]) –
- Return type
- property inputs: unfurl.result.ResultsMap
Exposes inputs and task settings as expression variables, so they can be accessed like:
eval: $inputs::param
or in jinja2 templates:
{{ inputs.param }}
- query(query, dependency=False, name=None, required=False, wantList=False, resolveExternal=True, strict=True, vars=None, throw=False, trace=None)
- Parameters
query (Union[str, dict]) –
dependency (bool) –
name (Optional[str]) –
required (bool) –
wantList (bool) –
resolveExternal (bool) –
strict (bool) –
vars (Optional[dict]) –
throw (bool) –
trace (Optional[int]) –
- Return type
Optional[Union[Any, unfurl.result.Result, List[unfurl.result.Result]]]
- remove_dependency(name)
- Parameters
name (str) –
- Return type
Optional[unfurl.configurator.Dependency]
- restore_envvars()
- sensitive(value)
Mark the given value as sensitive. Sensitive values will be encrypted or redacted when output.
- Returns
A copy of the value converted to the appropriate subtype of unfurl.logs.sensitive, or the value itself if it can’t be converted.
- Return type
- Parameters
value (object) –
- set_envvars()
Update os.environ with the task’s environ and save the current one so it can be restored by
restore_envvars
- set_work_folder(location='operation', preserve=None, always_apply=False)
- Parameters
location (str) –
preserve (Optional[bool]) –
always_apply (bool) –
- Return type
- update_instances(instances)
Notify Unfurl of new or changed instances made while the task is running.
This will queue a new child job if needed. To immediately run the child job based on the supplied spec, yield the returned JobRequest.
- Parameters
instances (Union[str, List[Dict[str, Any]]]) – Either a list or a string that is parsed as YAML.
- Return type
Tuple[Optional[unfurl.planrequests.JobRequest], List[unfurl.util.UnfurlTaskError]]
For example, this snippet creates a new instance and modifies the current target instance:
# create a new instance:
- name: name-of-new-instance
  parent: HOST  # or SELF or <instance name>
  # all other fields should match the YAML in an ensemble's status section
  template: aNodeTemplate
  attributes:
    anAttribute: aValue
  readyState:
    local: ok
    state: started
# modify an existing instance:
- name: SELF
  # the following fields are supported (all are optional):
  template: aNodeTemplate
  attributes:
    anAttribute: aNewValue
    ...
  artifacts:
    artifact1: ...
  readyState:
    local: ok
    state: started
  protected: true
  customized: true
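A generator-style run() that applies such a change and immediately runs the resulting child job might look like the following sketch (FakeTask is an invented stand-in so the example is self-contained; the real signatures are documented above):

```python
class FakeTask:
    # stand-in mirroring update_instances() -> (JobRequest or None, errors)
    def update_instances(self, spec):
        return ("a-job-request", [])

    def done(self, success):
        return ("done", success)

def run(task):
    job_request, errors = task.update_instances("- name: SELF")
    if job_request:
        yield job_request  # immediately run the child job
    yield task.done(not errors)

results = list(run(FakeTask()))
assert results == ["a-job-request", ("done", True)]
```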
- property vars: dict
A dictionary of the same variables that are available to expressions when evaluating inputs.
Internal classes supporting the runtime.
- class unfurl.support.NodeState(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)
An enumeration representing TOSCA Node States.
- configured = 5
- configuring = 4
- created = 3
- creating = 2
- deleted = 11
- deleting = 10
- error = 12
- initial = 1
- started = 7
- starting = 6
- stopped = 9
- stopping = 8
- class unfurl.support.Priority(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)
- critical = 3
- ignore = 0
- optional = 1
- required = 2
- class unfurl.support.Reason(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)
- add = 'add'
- check = 'check'
- connect = 'connect'
- degraded = 'degraded'
- error = 'error'
- force = 'force'
- missing = 'missing'
- prune = 'prune'
- reconfigure = 'reconfigure'
- run = 'run'
- stop = 'stop'
- undeploy = 'undeploy'
- update = 'update'
- upgrade = 'upgrade'
- class unfurl.support.Status(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)
- absent = 5
Instance confirmed to not exist.
- degraded = 2
Instance is operational but in a degraded state.
- error = 3
Instance is not operational.
- ok = 1
Instance is operational
- pending = 4
Instance is being brought up or hasn’t been created yet.
- unknown = 0
The operational state of the instance is unknown.
- class unfurl.result.ChangeRecord(jobId=None, startTime=None, taskId=0, previousId=None, parse=None)
A ChangeRecord represents a job or task in the change log file. It consists of a change ID and named attributes.
A change ID is an identifier with this sequence of 12 characters:
- “A” serves as a format version identifier
- 7 alphanumeric characters (0-9, A-Z, and a-z) encoding the date and time the job ran
- 4 hexadecimal digits encoding the task id
- Parameters
jobId (Optional[str]) –
startTime (Optional[datetime.datetime]) –
taskId (int) –
previousId (Optional[str]) –
parse (Optional[str]) –
- classmethod format_log(changeId, attributes)
format: changeid\tkey=value\tkey=value (fields are tab-separated)
- Parameters
changeId (str) –
attributes (dict) –
- Return type
str
- log(attributes=None)
changeid\tkey=value\tkey=value
- Parameters
attributes (Optional[dict]) –
- Return type
str
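The 12-character change-ID shape can be checked with a regular expression (this regex is derived from the description above for illustration and is not part of the unfurl API; the case of the hex digits is assumed to be either):

```python
import re

# "A" + 7 alphanumerics (date/time) + 4 hex digits (task id) = 12 chars
CHANGE_ID = re.compile(r"\AA[0-9A-Za-z]{7}[0-9A-Fa-f]{4}\Z")

assert CHANGE_ID.match("A01xYz9k0f3a")
assert not CHANGE_ID.match("B01xYz9k0f3a")  # wrong format version
assert not CHANGE_ID.match("A01xYz9k0f3")   # too short
```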
Project folders
An ensemble may create the following directories when a job runs:
- artifacts
Artifacts required for deployment (e.g. Terraform state files).
- secrets
Sensitive artifacts (e.g. certificates). They are vault-encrypted in the repository.
- local
Artifacts specific to this installation and so excluded from the repository (e.g. a pidfile)
- tasks
The most recently generated configuration files for each instance (for informational purposes only – excluded from repository and safe to delete).
Each of these directories will contain subdirectories named after each instance in the ensemble; their contents are populated as those instances are deployed.
When a plan is being generated, a directory named “planned” will be created with the same directory structure as above. When the job executes the plan, those files are moved to the corresponding directory if the task deployed successfully; otherwise they are moved to a directory named failed.<changeid>.
When a task runs, its configurator has access to these directories, which it can use to store artifacts in the ensemble’s repository or to generate local configuration files. For this, each deployed instance can have its own set of directories (see _get_base_dir()).
Because generating a plan should not impact what is currently deployed, during the planning and rendering phase a configurator can use the WorkFolder interface to read and write from temporary copies of those folders in the “planned” directory. They will either be moved to “active” if the task succeeds or discarded if it fails.
This also enables the files generated by the plan to be manually examined – useful for development, error diagnosis and user intervention, or as part of a git-based approval process.
- class unfurl.projectpaths.WorkFolder(task, location, preserve)
When a task is running, this class provides access to the directories associated with instances and tasks, such as the directories for reading and writing artifacts and secrets.
Updates to these directories through this class are performed transactionally – accessing a directory through this class marks it for writing and creates a copy of it in the planned directory.
If a task completes successfully, apply() is called, which copies the folder back to its permanent location in the ensemble’s “active” directory.
- Parameters
task (TaskView) –
location (str) –
preserve (bool) –
- always_apply = False
- apply()
- Return type
str
- copy_from(path)
- copy_to(path)
- property cwd
- discard()
- Return type
None
- failed()
- Return type
str
- get_current_path(path, mkdir=True)
- pending_path(path=None, mkdir=True)
An absolute path to the planning location of this directory.
- permanent_path(path, mkdir=True)
An absolute path to the permanent location of this directory.
- relpath_to_current(path)
- write_file(contents, name, encoding=None)
Create a file with the given contents.
- Parameters
contents – The contents to write to the file.
name (string) – Relative path to write to.
encoding (string) – (Optional) One of “binary”, “vault”, “json”, “yaml” or an encoding registered with the Python codec registry.
- Returns
An absolute path to the file.
- Return type
str
- unfurl.projectpaths._get_base_dir(ctx, name=None)
Returns an absolute path based on the given folder name:
- .
The directory that contains the current instance’s ensemble
- src
The directory of the source file this expression appears in
- artifacts
Directory for the current instance (committed to repository)
- local
The “local” directory for the current instance (excluded from repository)
- secrets
The “secrets” directory for the current instance (files written there are vault encrypted)
- tmp
A temporary directory for the current instance (removed after unfurl exits)
- tasks
Job-specific directory for the current instance (excluded from repository)
- operation
Operation-specific directory for the current instance (excluded from repository)
- workflow
Workflow-specific directory for the current instance (excluded from repository)
- spec.src
The directory of the source file the current instance’s template appears in
- spec.home
Directory unique to the current instance’s TOSCA template (committed to the spec repository)
- spec.local
Local directory unique to the current instance’s TOSCA template (excluded from repository)
- project
The root directory of the current project
- unfurl.home
The location of the home project (UNFURL_HOME)
Otherwise look for a repository with the given name and return its path or None if not found.
Runtime module
This module defines the core model and implements the runtime operations of the model.
The state of the system is represented as a collection of Instances. Each instance has a status, attributes that describe its state, and a TOSCA template which describes its capabilities, relationships, and available interfaces for configuring and interacting with it.
- class unfurl.runtime.Operational
This is an abstract base class for Jobs, Resources, and Configurations. All of these have a Status associated with them and all use the same algorithm to compute their status from their dependent resources, tasks, and configurations.
- static aggregate_status(statuses, seen)
Returns ok, degraded, pending or None:
If there are no instances, return None.
If any required instances are not operational, return pending or error.
If any others are not operational or degraded, return degraded.
Otherwise return ok.
(Instances with priority set to “ignore” are ignored.)
- Parameters
statuses (Iterable[unfurl.runtime.Operational]) –
seen (Dict[int, unfurl.runtime.Operational]) –
- Return type
Optional[unfurl.support.Status]
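The rules above can be sketched with plain strings standing in for the Status enum (the (status, required) pair representation is an invention of this example; entries with “ignore” priority are assumed already filtered out):

```python
def aggregate(statuses):
    # statuses: iterable of (status, required) pairs
    statuses = list(statuses)
    if not statuses:
        return None
    for status, required in statuses:
        # a required instance that is neither ok nor degraded
        # makes the aggregate pending (or error)
        if required and status not in ("ok", "degraded"):
            return "error" if status == "error" else "pending"
    # any remaining non-ok instance degrades the aggregate
    if any(status != "ok" for status, _ in statuses):
        return "degraded"
    return "ok"

assert aggregate([]) is None
assert aggregate([("ok", True)]) == "ok"
assert aggregate([("ok", True), ("degraded", False)]) == "degraded"
assert aggregate([("pending", True), ("ok", False)]) == "pending"
assert aggregate([("error", True)]) == "error"
```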
- get_operational_dependencies()
Return an iterator of Operational objects that this instance directly depends on to be operational.
- Return type
Iterable[unfurl.runtime.Operational]
- get_operational_dependents()
Return an iterator of Operational objects that directly depend on this instance to be operational.
- Return type
Iterable[unfurl.runtime.Operational]
- has_changed(changeset)
Whether or not this object changed since the given ChangeRecord.
- Parameters
changeset (Optional[unfurl.result.ChangeRecord]) –
- Return type
bool
- class unfurl.runtime.OperationalInstance(status=None, priority=None, manualOveride=None, lastStateChange=None, lastConfigChange=None, state=None)
A concrete implementation of Operational
- Parameters
status (Optional[Union[OperationalInstance, int, str]]) –
priority (Optional[unfurl.support.Priority]) –
manualOveride (Optional[Union[OperationalInstance, int, str]]) –
lastStateChange (Optional[str]) –
lastConfigChange (Optional[str]) –
state (Optional[unfurl.support.NodeState]) –
- Return type
None
- get_operational_dependencies()
Return an iterator of Operational objects that this instance directly depends on to be operational.
- Return type
Iterable[unfurl.runtime.Operational]
- property local_status: Optional[unfurl.support.Status]
The local_status property.
- property manual_override_status: Optional[unfurl.support.Status]
The manualOverideStatus property.
- property priority: Optional[unfurl.support.Priority]
The priority property.
- property state: Optional[unfurl.support.NodeState]
The state property.
APIs for controlling Unfurl
Localenv module
Classes for managing the local environment.
Repositories can optionally be organized into projects that have a local configuration.
By convention, the “home” project defines a localhost instance and adds it to its context.
- class unfurl.localenv.LocalEnv(manifestPath=None, homePath=None, parent=None, project=None, can_be_empty=False, override_context=None, overrides=None, readonly=False)
This class represents the local environment that an ensemble runs in, including the local project it is part of and the home project.
- Parameters
manifestPath (Optional[str]) –
homePath (Optional[str]) –
parent (Optional[unfurl.localenv.LocalEnv]) –
project (Optional[unfurl.localenv.Project]) –
can_be_empty (bool) –
override_context (Optional[str]) –
overrides (Optional[Dict[str, Any]]) –
readonly (Optional[bool]) –
- Return type
None
- find_git_repo(repoURL, revision=None)
- Parameters
repoURL (str) –
revision (Optional[str]) –
- Return type
Optional[unfurl.repo.GitRepo]
- find_or_create_working_dir(repoURL, revision=None, basepath=None, checkout_args={})
- Parameters
repoURL (str) –
revision (Optional[str]) –
basepath (Optional[str]) –
checkout_args (dict) –
- Return type
Tuple[Optional[unfurl.repo.GitRepo], Optional[str], Optional[bool]]
- find_path_in_repos(path, importLoader=None)
If the given path is part of the working directory of a git repository, return that repository and a path relative to it.
- Parameters
path (str) –
importLoader (Optional[Any]) –
- Return type
Tuple[Optional[unfurl.repo.GitRepo], Optional[str], Optional[str], Optional[bool]]
- find_project(testPath, stopPath=None)
Walk parents looking for unfurl.yaml
- Parameters
testPath (str) –
stopPath (Optional[str]) –
- Return type
Optional[unfurl.localenv.Project]
- get_context(context=None)
Return a new context that merges the given context with the local context.
- Parameters
context (Optional[dict]) –
- Return type
Dict[str, Any]
- get_external_manifest(location, skip_validation, safe_mode)
- Parameters
location (dict) –
skip_validation (bool) –
safe_mode (bool) –
- Return type
Optional[YamlManifest]
- get_local_instance(name, context)
- Parameters
name (str) –
context (dict) –
- Return type
Tuple[unfurl.runtime.NodeInstance, dict]
- get_manifest(path=None, skip_validation=False, safe_mode=None)
- Parameters
path (Optional[str]) –
skip_validation (bool) –
safe_mode (Optional[bool]) –
- Return type
YamlManifest
- get_paths()
Return a list of directories for $PATH. Includes the directory the unfurl script is installed in and, if asdf is installed, appends a PATH list from the .toolversions found in the current project and the home project.
- Return type
List[str]
- get_project(path, homeProject)
- Parameters
path (str) –
homeProject (Optional[unfurl.localenv.Project]) –
- Return type
- static get_runtime(ensemble_path, home_path)
- Parameters
ensemble_path (Optional[str]) –
home_path (Optional[str]) –
- Return type
Optional[str]
- get_vault()
- get_vault_password(vaultId='default')
- Parameters
vaultId (str) –
- Return type
Optional[str]
- link_repo(base_path, name, url, revision)
- Parameters
base_path (str) –
name (str) –
url (str) –
- Return type
Tuple[str, str]
- map_value(val, env_rules)
Evaluate using project home as a base dir.
- Parameters
val (Any) –
env_rules (Optional[dict]) –
- Return type
Any
- parent: Optional[unfurl.localenv.LocalEnv] = None
- project: Optional[unfurl.localenv.Project] = None
- class unfurl.localenv.Project(path, homeProject=None, overrides=None, readonly=False)
An Unfurl project is a folder that contains at least a local configuration file (unfurl.yaml) and one or more ensemble.yaml files, which may optionally be organized into one or more git repositories.
- Parameters
path (str) –
homeProject (Optional[Project]) –
overrides (Optional[dict]) –
readonly (Optional[bool]) –
- add_context(name, value)
- Parameters
name (str) –
value (dict) –
- adjust_manifest_path(location, local_env)
- Parameters
location (dict) –
local_env (unfurl.localenv.LocalEnv) –
- Return type
str
- create_working_dir(gitUrl, ref=None)
- Parameters
gitUrl (str) –
ref (Optional[str]) –
- Return type
unfurl.repo.GitRepo
- find_ensemble_by_name(name)
- Parameters
name (str) –
- Return type
Optional[dict]
- find_ensemble_by_path(path)
- Parameters
path (str) –
- Return type
Optional[dict]
- find_git_repo(repoURL, revision=None)
- Parameters
repoURL (str) –
revision (Optional[str]) –
- Return type
Optional[unfurl.repo.GitRepo]
- find_git_repo_from_repository(repoSpec)
- Parameters
repoSpec (toscaparser.repositories.Repository) –
- Return type
Optional[unfurl.repo.GitRepo]
- find_or_clone(repo)
- Parameters
repo (unfurl.repo.GitRepo) –
- Return type
unfurl.repo.GitRepo
- find_or_create_working_dir(repoURL, revision=None)
- Parameters
repoURL (str) –
revision (Optional[str]) –
- Return type
unfurl.repo.GitRepo
- static find_path(testPath, stopPath=None)
Walk parents looking for unfurl.yaml
- Parameters
testPath (str) –
stopPath (Optional[str]) –
- Return type
Optional[str]
- find_path_in_repos(path, importLoader=None)
If the given path is part of the working directory of a git repository, return that repository and a path relative to it.
- Parameters
path (str) –
importLoader (Optional[Any]) –
- Return type
Tuple[Optional[unfurl.repo.RepoView], Optional[str], Optional[bool]]
- static get_asdf_paths(projectRoot, asdfDataDir, toolVersions={})
- Return type
List[str]
- get_context(contextName, context=None)
- Parameters
contextName (Optional[str]) –
context (Optional[dict]) –
- Return type
dict
- get_default_context()
- Return type
Optional[str]
- get_default_project_path(context_name)
- Parameters
context_name (str) –
- Return type
Optional[str]
- get_managed_project(location, localEnv)
- Parameters
location (dict) –
localEnv (unfurl.localenv.LocalEnv) –
- Return type
Optional[unfurl.localenv.Project]
- static get_name_from_dir(projectRoot)
- Parameters
projectRoot (str) –
- Return type
str
- get_relative_path(path)
- Parameters
path (str) –
- Return type
str
- get_unique_path(name)
- Parameters
name (str) –
- Return type
str
- get_vault_password(contextName=None, vaultId='default')
- Parameters
contextName (Optional[str]) –
vaultId (str) –
- Return type
Optional[str]
- get_vault_passwords(contextName=None)
- Parameters
contextName (Optional[str]) –
- Return type
Iterable[Tuple[str, Union[str, bytes]]]
- has_ensembles()
- Return type
bool
- is_path_in_project(path)
- Parameters
path (str) –
- Return type
bool
- load_yaml_include(yamlConfig, templatePath, baseDir, warnWhenNotFound=False, expanded=None, action=None)
This is called while the YAML config is being loaded. Returns (url or fullpath, parsed yaml)
- Parameters
yamlConfig (unfurl.yamlloader.YamlConfig) –
templatePath (Union[str, dict]) –
action (Optional[unfurl.yamlloader.LoadIncludeAction]) –
- make_vault_lib(contextName=None)
- Parameters
contextName (Optional[str]) –
- Return type
Optional[ansible.parsing.vault.VaultLib]
- property name: str
- static normalize_path(path)
- Parameters
path (str) –
- Return type
str
- project_repoview: unfurl.repo.RepoView
- register_ensemble(manifestPath, *, project=None, managedBy=None, context=None)
- Parameters
manifestPath (str) –
project (Optional[unfurl.localenv.Project]) –
managedBy (Optional[unfurl.localenv.Project]) –
context (Optional[str]) –
- Return type
None
- register_project(project, for_context=None, changed=False, save_project=True)
- reload()
- search_for_default_manifest()
- Return type
Optional[str]
- property venv: Optional[str]
Job module
A Job is generated by comparing a list of specs with the last known state of the system. A job runs tasks, each of which has a configuration spec that is executed on the running system. Each task tracks and records its modifications to the system’s state.
- class unfurl.job.ConfigChange(parentJob=None, startTime=None, status=None, previousId=None, **kw)
Represents a configuration change made to the system. It has an operational status and a list of dependencies that contribute to its status. There are two kinds of dependencies:
Live resource attributes that the configuration’s inputs depend on.
Other configurations and resources it relies on to function properly.
- Parameters
parentJob (Optional[Job]) –
startTime (Optional[datetime.datetime]) –
status (Optional[Union[unfurl.runtime.OperationalInstance, int, str]]) –
previousId (Optional[str]) –
kw (Any) –
- Return type
None
- class unfurl.job.Job(manifest, rootResource, jobOptions, previousId=None)
runs ConfigTasks and child Jobs
- Parameters
manifest (YamlManifest) –
rootResource (unfurl.runtime.TopologyInstance) –
jobOptions (unfurl.job.JobOptions) –
previousId (Optional[str]) –
- Return type
None
- can_run_task(task, not_ready)
Checked at runtime right before each task is run:
- validate inputs
- check pre-conditions to see if it can be run
- check the task itself to see if it can be run
- Parameters
task (unfurl.job.ConfigTask) –
not_ready (Sequence[unfurl.planrequests.PlanRequest]) –
- Return type
Tuple[bool, str]
- get_operational_dependencies()
Return an iterator of Operational objects that this instance directly depends on to be operational.
- Return type
Iterable[unfurl.job.ConfigTask]
- run_task(task, not_ready, depth=0)
During each task run:
* Notification of metadata changes that reflect changes made to resources
* Notification of adding or removing a dependency on a resource or properties of a resource
* Notification of creation or deletion of a resource
* Requests a resource with requested metadata; if it doesn’t exist, a task is run to make it so (e.g. add a dns entry, install a package)
Returns a task.
- Parameters
task (unfurl.job.ConfigTask) –
not_ready (Sequence[unfurl.planrequests.PlanRequest]) –
depth (int) –
- Return type
unfurl.job.ConfigTask
- should_run_task(task)
Checked before rendering the task.
- Parameters
task (unfurl.job.ConfigTask) –
- Return type
Tuple[bool, str]
- class unfurl.job.JobOptions(**kw)
Options available to select which tasks are run, e.g. read-only
- Parameters
kw (Any) –
- Return type
None
- unfurl.job.run_job(manifestPath=None, _opts=None)
Loads the given Ensemble and creates and runs a job.
- Parameters
manifestPath (str, optional) – If None, it will look for an ensemble in the current working directory.
_opts (dict, optional) – The names of the command line options for creating jobs.
- Returns
The job that just ran, or None if it couldn’t be created.
- Return type
(Job)
Init module
This module implements creating and cloning project and ensembles as well Unfurl runtimes.
- unfurl.init.clone(source, dest, ensemble_name='ensemble', **options)
Clone the source project or ensemble to dest. If dest isn’t in a project, create a new one.
source can be a git URL or a path inside a local git repository. Git URLs can specify a particular file in the repository using an URL fragment like #<branch_or_tag>:<file/path>. You can use a cloudmap URL like cloudmap:<package_id>, which will resolve to a git URL.
source can point to an Unfurl project, an ensemble template, a service template, an existing ensemble, or a folder containing one of those.
The result of the clone depends on the destination:
dest | Result
Inside source project | New or forked ensemble (depending on source)
Missing or empty folder | Clone project, create new ensemble if missing
Another project | See below
Non-empty folder | Error, abort
When creating a new ensemble from a source, if the source points to:
an ensemble: fork the ensemble (clone without status and with a new uri)
an ensemble template or TOSCA service template: create a new ensemble from the template
a project: if the project includes an ensemble-template.yaml, use that; if missing, fork the project’s default ensemble
When dest is set to another project, clone’s behavior depends on source:
If the source is a local file path, the project and its local repository are registered in the destination project and a new ensemble is created based on the source.
If the source is a git URL, the repository is cloned inside the destination project. A new ensemble is only created if the source specified a particular ensemble or template, or if the source was a blueprint project (i.e. it contains an ensemble template but doesn’t contain any ensembles).
When deploying an ensemble that is in a project that was cloned into another project, the environment settings in each unfurl.yaml are merged, with the top-level project’s settings taking precedence.
- Parameters
source (str) –
dest (str) –
ensemble_name (str) –
options (Any) –
- Return type
str
Utility classes and functions
- class unfurl.logs.sensitive
Base class for marking a value as sensitive. Depending on the context, sensitive values will either be encrypted or redacted when output.
- exception unfurl.util.UnfurlError(message, saveStack=False, log=False)
- Parameters
message (object) –
saveStack (bool) –
log (bool) –
- Return type
None
- exception unfurl.util.UnfurlTaskError(task, message, log=40, dependency=None)
- Parameters
task (TaskView) –
message (object) –
log (int) –
- unfurl.util.filter_env(rules, env=None, addOnly=False, sub=None)
Applies the given list of rules to a dictionary of environment variables and returns a new dictionary.
- Parameters
rules (dict) – A dictionary of rules for adding, removing and filtering environment variables.
env (dict, optional) – The environment to apply the given rules to. If env is None it will be set to os.environ. Defaults to None.
addOnly (bool, optional) – If addOnly is False (the default) all variables in env will be included in the returned dict, otherwise only variables added by rules will be included.
sub (Optional[MutableMapping]) –
- Return type
Dict[str, str]
Rules are applied in the order they are declared in the rules dictionary. The following examples show the different patterns for the rules:
- foo: bar
Add foo=bar
- +foo
Copy foo from the current environment
- +foo: bar
Copy foo, or add foo=bar if it is not present
- +foo*
Copy all names from the current environment that match foo*
- +!foo*
Copy all names from the current environment except those matching foo*
- -!foo
Remove all names except for foo
- ^foo: /bar/bin
Treat foo like PATH and prepend /bar/bin:$foo
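The rule patterns above can be approximated in a few lines of Python. This is a simplified sketch of the documented semantics (it does not handle every edge case of `unfurl.util.filter_env`, and the `sub` parameter is omitted):

```python
import fnmatch
import os

def filter_env_sketch(rules, env=None, add_only=False):
    """Simplified sketch of the filter_env rule patterns; not Unfurl's code."""
    env = dict(os.environ if env is None else env)
    result = {} if add_only else dict(env)
    for key, value in rules.items():
        if key.startswith("^"):        # ^foo: /bar/bin -> prepend PATH-style
            name = key[1:]
            old = env.get(name, "")
            result[name] = f"{value}:{old}" if old else str(value)
        elif key.startswith("+!"):     # +!foo*: copy all except matches
            pat = key[2:]
            for name, v in env.items():
                if not fnmatch.fnmatch(name, pat):
                    result[name] = v
        elif key.startswith("+"):      # +foo / +foo*: copy; value is a fallback
            pat = key[1:]
            matched = False
            for name, v in env.items():
                if fnmatch.fnmatch(name, pat):
                    result[name] = v
                    matched = True
            if not matched and value is not None:
                result[pat] = str(value)
        elif key.startswith("-!"):     # -!foo: remove all names except foo
            pat = key[2:]
            result = {n: v for n, v in result.items() if fnmatch.fnmatch(n, pat)}
        else:                          # foo: bar -> add foo=bar
            result[key] = str(value)
    return result
```

For instance, `filter_env_sketch({"+FOO*": None}, {"FOO": "1", "BAZ": "2"}, add_only=True)` copies only the names matching `FOO*`.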
- class unfurl.util.sensitive_bytes
Transparent wrapper class to mark bytes as sensitive
- decode(*args, **kwargs)
Wrapper method to ensure type conversions maintain sensitive context
- Parameters
args (List[object]) –
kwargs (collections.abc.Mapping) –
- Return type
- class unfurl.util.sensitive_dict
Transparent wrapper class to mark a dict as sensitive
- class unfurl.util.sensitive_list(iterable=(), /)
Transparent wrapper class to mark a list as sensitive
- class unfurl.util.sensitive_str
Transparent wrapper class to mark a str as sensitive
- encode(*args, **kwargs)
Wrapper method to ensure type conversions maintain sensitive context
- Parameters
args (List[object]) –
kwargs (collections.abc.Mapping) –
- Return type
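The "wrapper methods maintain sensitive context" behavior can be illustrated with a minimal sketch, modeled on (but not identical to) the `unfurl.util` classes: converting a sensitive str to bytes, or sensitive bytes to a str, yields a value that is still marked sensitive.

```python
# Minimal sketch of sensitive-context-preserving wrappers.
# Class names here are illustrative, not Unfurl's actual classes.
class SensitiveStr(str):
    def encode(self, *args, **kwargs):
        # encoding a sensitive str yields sensitive bytes
        return SensitiveBytes(super().encode(*args, **kwargs))

class SensitiveBytes(bytes):
    def decode(self, *args, **kwargs):
        # decoding sensitive bytes yields a sensitive str
        return SensitiveStr(super().decode(*args, **kwargs))

secret = SensitiveStr("hunter2")
encoded = secret.encode("utf-8")   # still sensitive, now as bytes
```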
Eval module
Public API:
map_value – returns a copy of the given value, resolving any embedded queries or template strings
Ref.resolve – given an expression, returns a list of values or Result
Ref.resolve_one – given an expression, returns a value, None, or a list of values
Ref.is_ref – returns true if the given dictionary looks like a Ref
Internal:
eval_ref() – given an expression (string or dictionary), returns a list of Result
Expr.resolve() – given an expression string, returns a list of Result
Results._map_value – same as map_value but with lazy evaluation
- class unfurl.eval.Ref(exp, vars=None, trace=None)
A Ref object describes a path to metadata associated with a resource.
- Parameters
exp (Union[str, collections.abc.Mapping]) –
vars (Optional[dict]) –
trace (Optional[int]) –
- Return type
None
- resolve(ctx: RefContext, wantList: Literal[True] = True, strict: Optional[bool] = None) List[Any]
- resolve(ctx: RefContext, wantList: Literal[False], strict: Optional[bool] = None) ResolveOneUnion
- resolve(ctx: RefContext, wantList: str, strict: Optional[bool] = None) List[Result]
- resolve(ctx: RefContext, wantList: Union[bool, str] = True, strict: Optional[bool] = None) Union[List[Result], List[Any], ResolveOneUnion]
If wantList=True (the default), returns a list of values. Note that values in the list can themselves be a list or None. If wantList=False, follows resolve_one semantics. If wantList=’result’, returns a list of Result.
- resolve_one(ctx, strict=None)
If there is no match, return None. If there is more than one match, return a list of matches. Otherwise return the single match.
Note: If you want to distinguish between None values and no match, or between a single match that is a list and a list of matches, use resolve(), which always returns a (possibly empty) list of matches.
- Parameters
ctx (unfurl.eval.RefContext) –
strict (Optional[bool]) –
- Return type
Union[None, Any, List[Any]]
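The relationship between resolve() and resolve_one() can be sketched as follows: resolve() always returns a (possibly empty) list of matches, and resolve_one() collapses that list. This is a hypothetical helper mirroring the documented semantics, not Unfurl's code:

```python
from typing import Any, List, Optional, Union

def resolve_one_from_matches(matches: List[Any]) -> Union[None, Any, List[Any]]:
    """Collapse a resolve()-style match list into resolve_one() semantics."""
    if not matches:
        return None          # no match
    if len(matches) == 1:
        return matches[0]    # single match (may itself be a list or None)
    return matches           # multiple matches
```

Note the documented ambiguity: a single match that happens to be a list is indistinguishable from multiple matches, which is why resolve() is preferred when that distinction matters.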
- class unfurl.eval.RefContext(currentResource, vars=None, wantList=False, resolveExternal=False, trace=None, strict=None, task=None)
The context of the expression being evaluated.
- Parameters
currentResource (ResourceRef) –
vars (Optional[dict]) –
wantList (Optional[Union[bool, str]]) –
resolveExternal (bool) –
trace (Optional[int]) –
strict (Optional[bool]) –
task (Optional[TaskView]) –
- Return type
None
- unfurl.eval.eval_ref(val, ctx, top=False)
val is assumed to be an expression; evaluate it and return a list of Result.
- Parameters
val (Union[collections.abc.Mapping, str]) –
ctx (unfurl.eval.RefContext) –
top (bool) –
- Return type
List[unfurl.result.Result]
- unfurl.eval.map_value(value, resourceOrCxt, applyTemplates=True)
Resolves any expressions or template strings embedded in the given map or list.
- Parameters
value (Any) –
resourceOrCxt (Union[unfurl.eval.RefContext, unfurl.result.ResourceRef]) –
applyTemplates (bool) –
- Return type
Any
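The effect of map_value can be illustrated with a simplified sketch: walk a map or list and replace embedded expressions with their resolved values. Here the "expressions" are just `$var` placeholders for brevity; Unfurl's real resolver evaluates full eval expressions and template strings against a RefContext.

```python
# Illustrative sketch of map_value's recursive resolution; not Unfurl's code.
def map_value_sketch(value, variables):
    if isinstance(value, dict):
        return {k: map_value_sketch(v, variables) for k, v in value.items()}
    if isinstance(value, list):
        return [map_value_sketch(v, variables) for v in value]
    if isinstance(value, str) and value.startswith("$"):
        # stand-in for evaluating an embedded expression
        return variables.get(value[1:], value)
    return value

config = {"host": "$hostname", "ports": ["$port", 443]}
resolved = map_value_sketch(config, {"hostname": "example.com", "port": 80})
```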
Graphql module
This module defines GraphQL representations of ensembles and Unfurl environments, including simplified representations of TOSCA types and node templates as documented by the GraphQL schema below.
These objects are exported as JSON by the export command and by unfurl server API endpoints.
type DeploymentTemplate {
name: String!
title: String!
slug: String!
description: String
visibility: String
metadata: JSON
blueprint: ApplicationBlueprint!
primary: ResourceTemplate!
resourceTemplates: [ResourceTemplate!]
cloud: ResourceType
environmentVariableNames: [String!]
source: String
branch: String
commitTime: String
}
type Deployment {
title: String!
primary: Resource
resources: [Resource!]
deploymentTemplate: DeploymentTemplate!
url: String
status: Status
summary: String
workflow: String
deployTime: String
packages: JSON
}
type ResourceType {
name: String!
title: String
extends: [ResourceType!]
description: String
badge: String
icon: String
visibility: String
details_url: String
inputsSchema: JSON
computedPropertiesSchema: JSON
outputsSchema: JSON
requirements: [RequirementConstraint!]
implementations: [String]
implementation_requirements: [String]
directives: [String]
metadata: JSON
_sourceinfo: JSON
}
type RequirementConstraint {
name: String!
title: String
description: String
resourceType: ResourceType!
match: ResourceTemplate
min: Int
max: Int
badge: String
visibility: String
icon: String
inputsSchema: JSON!
requirementsFilter: [RequirementConstraint!]
}
type ResourceTemplate {
name: String!
title: String
type: ResourceType!
visibility: String
directives: [String!]
imported: String
metadata: JSON
description: String
# Maps to an object that conforms to type.inputsSchema
properties: [Input!]
dependencies: [Requirement!]
}
type Requirement {
name: String!
constraint: RequirementConstraint!
match: ResourceTemplate
target: Resource
visibility: String
}
type Resource {
name: String!
title: String!
url: String
template: ResourceTemplate!
status: Status
state: State
attributes: [Input!]
computedProperties: [Input!]
connections: [Requirement!]
protected: Boolean
imported: String
}
type DeploymentEnvironment {
name: String!
connections: [ResourceTemplate!]
instances: [ResourceTemplate!]
primary_provider: ResourceTemplate
repositories: JSON!
}
type DeploymentPath {
name: String!
environment: String!
project_id: String
pipelines: [JSON!]
incremental_deploy: Boolean!
}
type ApplicationBlueprint {
name: String!
title: String
description: String
primary: ResourceType!
primaryDeploymentBlueprint: String
deploymentTemplates: [DeploymentTemplate!]
projectPath: String
blueprintPath: String
livePreview: String
sourceCodeUrl: String
image: String
projectIcon: String
}
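Since these objects are exported as JSON, a ResourceTemplate might serialize to something like the following. This is a hypothetical, minimal example consistent with the field names in the schema above; the values and the exact serialization of references (e.g. `type` as a type name) are invented for illustration.

```python
import json

# Hypothetical minimal ResourceTemplate JSON object (values are invented).
resource_template = {
    "name": "my_db",                   # String! (required)
    "title": "My Database",
    "type": "PostgresDB",              # reference to a ResourceType (required)
    "directives": [],
    "metadata": {},
    "description": "Example database template",
    "properties": [{"name": "db_name", "value": "app"}],
    "dependencies": [{"name": "host", "constraint": "host", "match": None}],
}

# The non-null fields in the GraphQL type must be present and non-null
required_fields = ("name", "type")
serialized = json.dumps(resource_template, indent=2)
```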