Python API

API for writing service templates

See here for an overview of the TOSCA Python DSL.

TOSCA Types

The classes in this section are used to define TOSCA types, such as Nodes, Relationships, Capabilities, Artifacts, and Data Types. TOSCA Type objects correspond to templates (such as node templates or relationship templates) defined in a TOSCA service template.

class tosca.ToscaType(_name='', *, _metadata=<factory>)

Base class for TOSCA type definitions.

ToscaTypes are Python dataclasses with custom fields that correspond to TOSCA’s field types including properties, requirements, capabilities, and artifacts. You don’t need to use Python’s dataclass decorators or functions directly.

Any public field (i.e. not starting with _) will be included in the TOSCA YAML for the template, using the field’s type annotation to deduce which TOSCA field type it maps to (e.g. Nodes are treated as requirements, Artifacts are treated as artifacts, etc., defaulting to TOSCA properties). This can be customized using TOSCA Field Specifiers.

class Example(tosca.nodes.Root):
    # Node type, so it's a TOSCA requirement
    host: tosca.nodes.Compute

    # an artifact
    shellScript: tosca.artifacts.ImplementationBash

    # other types default to TOSCA properties
    location: str

    # use field specifiers to customize
    dns: str = Property(constraints=[max_length(20)], options=sensitive)

    # ignored, starts with "_"
    _internal: str

When your Python code is first executed, e.g. when it is imported into a TOSCA service template, it executes in "parse" mode to prepare the objects so they can be translated to TOSCA YAML.

In parse mode, user code accessing TOSCA fields may appear to receive concrete values but actually (thanks to type punning) it receives FieldProjection objects, which are used to record the relationships needed to generate the TOSCA YAML consumed by the orchestrator.

The orchestrator executes methods defined on ToscaType objects when running TOSCA operations, through tosca.Computed properties, and via the Self variable in Jinja2 templates. In those contexts global_state_mode will be set to “runtime” mode. In runtime mode, the object proxies the instance created from the TOSCA template; the values of its TOSCA fields correspond to the values on that instance.

Parameters
  • _name (str) –

  • _metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –

Return type

None

_template_init()

User-defined callback invoked when the template object is initialized. Define this method in your subclass in lieu of __init__ or __post_init__. Compared to _class_init, it is invoked for each template instead of during class definition.

You can set the default value of fields in your class definition to DEFAULT to indicate they will be assigned a value in the _template_init method. If they are not set in _template_init, they will be set to their default value automatically.

You can check if a field has its default value by calling has_default (e.g. if the field wasn’t specified as keyword argument when creating the template).

class Example(tosca.nodes.Root):
    # DEFAULT will create new object per instance (here a Compute node template).
    # A runtime error will be raised if the type can't be created with default values.
    host: tosca.nodes.Compute = tosca.DEFAULT

    def _template_init(self) -> None:
        if self.has_default("host"):
            # only set if the template didn't specify a host when it was created
            self.host = tosca.nodes.Compute(mem_size=4 * GB)

        # or (better, enables static type checking that the field has been declared):
        if self.has_default(self.__class__.host):
            self.host = tosca.nodes.Compute(mem_size=4 * GB)
Return type

None

classmethod _class_init()

A user-defined class method that is called when the class definition is being initialized, specifically before Python’s dataclass machinery is invoked. Define this method in a subclass if you want to customize your class definition.

Inside this method, referencing class fields will return a FieldProjection, but this is hidden from the static type checker and IDE through type punning.

You can set the default value of fields in your class definition to CONSTRAINED to indicate they will be configured in your _class_init method.

Return type

None

class Example(tosca.nodes.Root):

    # set default to CONSTRAINED to indicate a value will be assigned in _class_init
    host: tosca.nodes.Compute = tosca.CONSTRAINED

    @classmethod
    def _class_init(cls) -> None:
        # the proxy node template created here will be shared by all instances of Example unless one sets its own.
        cls.host = tosca.nodes.Compute()

        # Constrain the memory size of the host compute.
        # this will apply to all instances even if one sets its own Compute instance.
        # (The generated YAML for the host requirement on this node type will include a node_filter with an in_range property constraint).
        in_range(2 * gb, 20 * gb).apply_constraint(cls.host.mem_size)
__getattribute__(name)

Customizes attribute access. In parse mode, accessing an attribute that is a TOSCA field will return a FieldProjection instead of its value if any of the following are true:

  • the field hasn’t been initialized yet (e.g. its value is DEFAULT, REQUIRED, or CONSTRAINED)

  • the field is a TOSCA Attribute (as their values are set at runtime)

  • the value is EvalData

  • the value is a ToscaType object set directly in the class definition.

Parameters

name (str) –

find_configured_by(field_name: str | FieldProjection)

Transitively search for field_name along the .configured_by axis (see Special keys) and return the first match.

For example:

class A(Node):
  pass

class B(Node):
  url: str
  connects_to: A = tosca.Requirement(relationship=unfurl.relationships.Configures)

a = A()
b = B(connects_to=a, url="https://example.com")

>>> a.find_configured_by(B.url)
"https://example.com"

If called during class definition this will return an eval expression. If called as a classmethod or as a free function it will evaluate in the current context.

Parameters

field_name (str | FieldProjection) – Either the name of the field, or, for more type safety, a reference to the field (e.g. B.url in the example above).

Returns

The value of the referenced field

Return type

Any

find_hosted_on(field_name: str | FieldProjection)

Transitively search for field_name along the .hosted_on axis (see Special keys) and return the first match.

class A(Node):
  url: str

class B(Node):
  host: A = tosca.Requirement(relationship=tosca.relationships.HostedOn)

a = A(url="https://example.com")
b = B(host=a)

>>> b.find_hosted_on(A.url)
"https://example.com"

If called during class definition this will return an eval expression. If called as a classmethod or as a free function it will evaluate in the current context.

Parameters

field_name (str | FieldProjection) – Either the name of the field, or, for more type safety, a reference to the field (e.g. A.url in the example above).

Returns

The value of the referenced field

Return type

Any

from_owner(field_name: str | FieldProjection)

Return field_name on the .owner of the template (see Special keys).

class App(Node):
  name: str
  my_artifact: MyArtifact = MyArtifact(app_name=from_owner("name"))

a = App(name="foo")
assert a.my_artifact.app_name == a.name
# also available as a method:
assert a.my_artifact.from_owner(App.name) == a.name

If called during class definition this will return an eval expression. If called as a classmethod or as a free function it will use the current context’s instance. If the template instance is an embedded template (an artifact, a relationship, a data entity, or a capability), the field will be evaluated on the node template the template is embedded in, otherwise it will be evaluated on the template itself.

Parameters

field_name (str | FieldProjection) – Either the name of the field, or, for more type safety, a reference to the field (e.g. App.name in the example above).

Returns

The value of the referenced field

Return type

Any

get_ref(ref)

Return a reference to a field. Raises AttributeError if the field doesn’t exist and TypeError if the FieldProjection is for a different class than self.

Parameters

ref (str | FieldProjection) – Either the name of the field, or, for more type safety, a reference to the field (e.g. MyType.my_prop).

Returns

FieldProjection

Return type

tosca._tosca.FieldProjection

has_default(ref)

Return True if the attribute has its default value from the class definition or hasn’t been set at all.

Parameters

ref (str | FieldProjection) – Either the name of the field, or, for more type safety, a reference to the field (e.g. MyType.my_prop).

Return type

bool

classmethod set_to_property_source(requirement, property)

Sets the given requirement to the TOSCA template that provided the value of “property”.

For example, if A.property = B.property then A.set_to_property_source("requirement", "property") will create a node filter for A.requirement that selects B.

The requirement and property have to be defined on the same class. The method should be called from _class_init(cls).

Raises

TypeError – If requirement or property are missing from cls.

Return type

None

The requirement and property names can also be strings, e.g.:

cls.set_to_property_source("requirement", "property")

Note that, when called within _class_init(cls), cls.set_to_property_source(cls.requirement, cls.property) is equivalent to:

cls.requirement = cls.property

but using this method avoids static type checker complaints.
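
For example, a hedged sketch (the class and field names are illustrative, not part of the API):

class App(tosca.nodes.Root):
    db_url: str = ""
    db: tosca.nodes.Root = tosca.CONSTRAINED

    @classmethod
    def _class_init(cls) -> None:
        # generate a node_filter on the "db" requirement that selects whichever
        # template provides the value of the "db_url" property
        cls.set_to_property_source(cls.db, cls.db_url)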

set_operation(op, name=None)

Assign the given TOSCA operation to this TOSCA object. TOSCA allows operations to be defined directly on templates.

Parameters
  • op (Callable[[Concatenate[tosca._tosca.ToscaType, ...]], Any]) – A function that implements the operation. It should look like a method, i.e. accept Self as the first argument. Using the tosca.operation function decorator is recommended but not required.

  • name (Optional[Union[str, Callable[[Concatenate[tosca._tosca.ToscaType, ...]], Any]]]) – The TOSCA operation name. If omitted, op’s operation_name or the op’s function name is used.

Return type

Self

Returns Self to allow chaining.
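
For example, a hedged sketch (the function and template names are illustrative):

@tosca.operation(name="configure")
def do_configure(self) -> None:
    # operation implementation invoked by the orchestrator at runtime
    ...

# assign the operation directly on a template (set_operation returns the template for chaining)
my_server = tosca.nodes.Compute().set_operation(do_configure)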

class tosca.Node(_name='', *, _metadata=<factory>, _directives=<factory>, _node_filter=None)

A TOSCA node template.

Parameters
  • _name (str) –

  • _metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –

  • _directives (List[str]) –

  • _node_filter (Optional[Dict[str, Any]]) –

Return type

None

find_required_by(requirement_name: str | FieldProjection, expected_type: Type[Node] | None = None)

Finds the node template with a requirement named requirement_name whose value is this template.

For example:

class A(Node):
  pass

class B(Node):
  connects_to: A

a = A()
b = B(connects_to=a)

>>> a.find_required_by(B.connects_to, B)
b

If no match is found, or more than one match is found, an error is raised. If 0 or more matches are expected, use find_all_required_by.

If called during class definition this will return an eval expression. If called as a classmethod or as a free function it will evaluate in the current context.

For example, to expand on the example above:

class A(Node):
  parent: B = find_required_by(B.connects_to, B)

parent will default to an eval expression.

Parameters
  • requirement_name (str | FieldProjection) – Either the name of the requirement, or, for more type safety, a reference to the requirement (e.g. B.connects_to in the example above).

  • expected_type (Node, optional) – The expected type of the node template to be returned. If provided, enables static typing and runtime validation of the return value.

Returns

The node template that is targeting this template via the requirement.

Return type

Node

find_all_required_by(requirement_name: str | FieldProjection, expected_type: Type[Node] | None = None)

Behaves the same as find_required_by but returns a list of all the matches found. If no match is found, return an empty list.

Parameters
  • requirement_name (str | FieldProjection) – Either the name of the requirement, or, for more type safety, a reference to the requirement (e.g. B.connects_to in the example above).

  • expected_type (Node, optional) – The expected type of the node template to be returned. If provided, enables static typing and runtime validation of the return value.

Return type

List[tosca.Node]
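
Continuing the find_required_by example above, a hedged sketch with a second (hypothetical) template targeting a:

b2 = B(connects_to=a)

>>> a.find_all_required_by(B.connects_to, B)
[b, b2]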

class tosca.Relationship(_name='', *, _metadata=<factory>, _local_name=None, _node=None, _default_for=None, _target=None)
Return type

None

class tosca.CapabilityEntity(_name='', *, _metadata=<factory>, _local_name=None, _node=None)
Parameters
  • _name (str) –

  • _metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –

  • _local_name (Optional[str]) –

  • _node (Optional[tosca._tosca.Node]) –

Return type

None

class tosca.DataEntity(_name='', *, _metadata=<factory>, _local_name=None, _node=None)
Parameters
  • _name (str) –

  • _metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –

  • _local_name (Optional[str]) –

  • _node (Optional[tosca._tosca.Node]) –

Return type

None

class tosca.ArtifactEntity(_name='', *, _metadata=<factory>, _local_name=None, _node=None, file='', repository=None, deploy_path=None, version=None, checksum=None, checksum_algorithm=None, permissions=None, intent=None, target=None, order=None, contents=None, dependencies=None)
Parameters
  • _name (str) –

  • _metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –

  • _local_name (Optional[str]) –

  • _node (Optional[tosca._tosca.Node]) –

  • file (str) –

  • repository (Optional[str]) –

  • deploy_path (Optional[str]) –

  • version (Optional[str]) –

  • checksum (Optional[str]) –

  • checksum_algorithm (Optional[str]) –

  • permissions (Optional[str]) –

  • intent (Optional[str]) –

  • target (Optional[str]) –

  • order (Optional[int]) –

  • contents (Optional[str]) –

  • dependencies (Optional[List[Union[str, Dict[str, str]]]]) –

Return type

None

class tosca.Group(_name='', *, _metadata=<factory>, _members=())
Parameters
  • _name (str) –

  • _metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –

  • _members (Sequence[Union[tosca._tosca.Node, tosca._tosca.Group]]) –

Return type

None

class tosca.Policy(_name='', *, _metadata=<factory>, _targets=())
Parameters
  • _name (str) –

  • _metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –

  • _targets (Sequence[Union[tosca._tosca.Node, tosca._tosca.Group]]) –

Return type

None

TOSCA Field Specifiers

The following are functions used as field specifiers when declaring attributes on TOSCA Types. Use these if you need to specify TOSCA-specific information about the field or if the TOSCA field type can’t be inferred from the Python attribute’s type. For example:

class MyNode(tosca.nodes.Root):
    a_tosca_property: str = Property(name="a-tosca-property", default=None, metadata={"foo": "bar"})

Note that these functions all take keyword-only parameters (this is needed for IDE integration).

tosca.Artifact(*, default=<dataclasses._MISSING_TYPE object>, factory=<dataclasses._MISSING_TYPE object>, name='', metadata=None, options=None)

Field specifier for declaring a TOSCA artifact.

Parameters
  • default (Any, optional) – Default value. Set to None if the artifact isn’t required. Defaults to MISSING.

  • factory (Callable, optional) – Factory function to initialize the artifact with a unique value per template. Defaults to MISSING.

  • name (str, optional) – TOSCA name of the field, overrides the artifact’s name when generating YAML. Defaults to “”.

  • metadata (Dict[str, JSON], optional) – Dictionary of metadata to associate with the artifact.

  • options (Options, optional) – Additional typed metadata to merge into metadata.

Return type

Any

tosca.Attribute(*, default=None, factory=<dataclasses._MISSING_TYPE object>, name='', constraints=None, metadata=None, title='', status='', options=None, init=False)

Field specifier for declaring a TOSCA attribute.

Parameters
  • default (Any, optional) – Default value. Set to None if the attribute isn’t required. Defaults to MISSING.

  • factory (Callable, optional) – Factory function to initialize the attribute with a unique value per template. Defaults to MISSING.

  • name (str, optional) – TOSCA name of the field, overrides the attribute’s name when generating YAML. Defaults to “”.

  • metadata (Dict[str, JSON], optional) – Dictionary of metadata to associate with the attribute.

  • options (Options, optional) – Additional typed metadata to merge into metadata.

  • constraints (List[DataConstraint], optional) – List of TOSCA property constraints to apply to the attribute.

  • title (str, optional) – Human-friendly alternative name of the attribute.

  • status (str, optional) – TOSCA status of the attribute.

  • init (Literal[False]) –

Return type

Any
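
For example, a minimal sketch (the class and field names are illustrative):

class MyServer(tosca.nodes.Root):
    # a TOSCA attribute: its value is set by the orchestrator at runtime,
    # so it is excluded from the constructor (init=False)
    ip_address: str = tosca.Attribute()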

tosca.Capability(*, default=<dataclasses._MISSING_TYPE object>, factory=<dataclasses._MISSING_TYPE object>, name='', metadata=None, options=None, valid_source_types=None)

Field specifier for declaring a TOSCA capability.

Parameters
  • default (Any, optional) – Default value. Set to None if the capability isn’t required. Defaults to MISSING.

  • factory (Callable, optional) – Factory function to initialize the capability with a unique value per template. Defaults to MISSING.

  • name (str, optional) – TOSCA name of the field, overrides the capability’s name when generating YAML. Defaults to “”.

  • metadata (Dict[str, JSON], optional) – Dictionary of metadata to associate with the capability.

  • options (Options, optional) – Additional typed metadata to merge into metadata.

  • valid_source_types (List[str], optional) – List of TOSCA type names to set as the capability’s valid_source_types

Return type

Any

tosca.Computed(name='', *, factory, metadata=None, title='', status='', options=None, attribute=False)

Field specifier for declaring a TOSCA property whose value is computed by the factory function at runtime.

Parameters
  • factory (function) – function called at runtime every time the property is evaluated.

  • name (str, optional) – TOSCA name of the field, overrides the Python name when generating YAML.

  • metadata (Dict[str, JSON], optional) – Dictionary of metadata to associate with the property.

  • title (str, optional) – Human-friendly alternative name of the property.

  • status (str, optional) – TOSCA status of the property.

  • options (Options, optional) – Typed metadata to apply.

  • attribute (bool, optional) – Indicate that the property is also a TOSCA attribute.

Return type

tosca._fields.RT – the return type of the factory function (should be compatible with the field type).
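
For example, a hedged sketch, assuming the factory can be a method-like function that receives the template instance (the class and field names are illustrative):

class WebApp(tosca.nodes.Root):
    host: str = "localhost"
    port: int = 8080

    def _url(self) -> str:
        # evaluated at runtime each time the property is accessed
        return f"http://{self.host}:{self.port}"

    url: str = tosca.Computed(factory=_url)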

tosca.Property(*, default=<dataclasses._MISSING_TYPE object>, factory=<dataclasses._MISSING_TYPE object>, name='', constraints=None, metadata=None, title='', status='', options=None, attribute=False)

Field specifier for declaring a TOSCA property.

Parameters
  • default (Any, optional) – Default value. Set to None if the property isn’t required. Defaults to MISSING.

  • factory (Callable, optional) – Factory function to initialize the property with a unique value per template. Defaults to MISSING.

  • name (str, optional) – TOSCA name of the field, overrides the property’s name when generating YAML. Defaults to “”.

  • metadata (Dict[str, JSON], optional) – Dictionary of metadata to associate with the property.

  • options (Options, optional) – Additional typed metadata to merge into metadata.

  • constraints (List[DataConstraint], optional) – List of TOSCA property constraints to apply to the property.

  • title (str, optional) – Human-friendly alternative name of the property.

  • status (str, optional) – TOSCA status of the property.

  • attribute (bool, optional) – Indicate that the property is also a TOSCA attribute. Defaults to False.

Return type

Any

tosca.Requirement(*, default=<dataclasses._MISSING_TYPE object>, factory=<dataclasses._MISSING_TYPE object>, name='', metadata=None, options=None, relationship=None, capability=None, node=None, node_filter=None)

Field specifier for declaring a TOSCA requirement.

Parameters
  • default (Any, optional) – Default value. Set to None if the requirement isn’t required. Defaults to MISSING.

  • factory (Callable, optional) – Factory function to initialize the requirement with a unique value per template. Defaults to MISSING.

  • name (str, optional) – TOSCA name of the field, overrides the requirement’s name when generating YAML. Defaults to “”.

  • metadata (Dict[str, JSON], optional) – Dictionary of metadata to associate with the requirement.

  • options (Options, optional) – Additional typed metadata to merge into metadata.

  • relationship (str | Type[Relationship], optional) – The requirement’s relationship specified by TOSCA type name or Relationship class.

  • capability (str | Type[CapabilityEntity], optional) – The requirement’s capability specified by TOSCA type name or CapabilityEntity class.

  • node (str | Type[Node], optional) – The requirement’s node specified by TOSCA type name or Node class.

  • node_filter (Dict[str, Any], optional) – The TOSCA node_filter for this requirement.

Return type

Any
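
For example, a minimal sketch (the class names are illustrative):

from typing import Optional

class Database(tosca.nodes.Root):
    pass

class App(tosca.nodes.Root):
    # optional requirement (default=None) with an explicit relationship type
    db: Optional[Database] = tosca.Requirement(
        default=None, relationship=tosca.relationships.ConnectsTo
    )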

tosca.operation(func=None, *, name='', apply_to=None, timeout=None, operation_host=None, environment=None, dependencies=None, outputs=None, entry_state=None, invoke=None, metadata=None)

Function decorator that marks a function or method as a TOSCA operation.

Parameters
  • name (str, optional) – Name of the TOSCA operation. Defaults to the name of the method.

  • apply_to (Sequence[str], optional) – List of TOSCA operations to apply this method to. If omitted, the method is matched by its operation name.

  • timeout (float, optional) – Timeout for the operation (in seconds). Defaults to None.

  • operation_host (str, optional) – The name of host where this operation will be executed. Defaults to None.

  • environment (Dict[str, str], optional) – A map of environment variables to use while executing the operation. Defaults to None.

  • dependencies (List[Union[str, Dict[str, Any]]], optional) – List of artifacts this operation depends on. Defaults to None.

  • outputs (Dict[str, str], optional) – TOSCA outputs mapping. Defaults to None.

  • entry_state (str, optional) – Node state required to invoke this operation. Defaults to None.

  • invoke (str, optional) – Name of operation to delegate this operation to. Defaults to None.

  • metadata (Dict[str, JSON], optional) – Dictionary of metadata to associate with the operation. Defaults to None.

  • func (Optional[Union[tosca._tosca.F, tosca._tosca.ArtifactEntity]]) –

Return type

Union[tosca._tosca.F, Callable[[tosca._tosca.F], tosca._tosca.F]]

This example marks a method as implementing the create and delete operations on the TOSCA Standard interface.

@operation(apply_to=["Standard.create", "Standard.delete"])
def default(self):
    return self.my_artifact.execute()

If you wish to declare an abstract operation on a custom interface without specifying its signature, assign operation() directly, for example:

class MyInterface(tosca.interfaces.Root):
    my_operation = operation()
    "Invoke this method to perform my_operation"

This will avoid static type-check errors when subclasses declare a method implementing the operation.

You can also use it to create an operation from an Artifact without having to define a method.

# set the "configure" operation on "my_node" with the given artifact as its implementation.
my_node = MyNode().set_operation(operation(ShellExecutable("configure", command="./script.sh {{ SELF.prop }}")))

Property Constraints

class tosca.DataConstraint(constraint)

Base class for TOSCA property constraints. There is a DataConstraint subclass with the same name as each of the TOSCA constraint types listed below.

These can be passed to the Property and Attribute field specifiers (via their constraints argument) or used in Python type annotations (see the sketch below).

Parameters

constraint (tosca._tosca.T) –
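
For example, a hedged sketch; the field-specifier form follows the Property documentation above, while passing a constraint through typing.Annotated is an assumption about the DSL:

from typing import Annotated

class Service(tosca.nodes.Root):
    # constraint passed via the Property field specifier
    port: int = tosca.Property(default=443, constraints=[tosca.in_range(1, 65535)])

    # constraint embedded in the type annotation (assumed Annotated support)
    name: Annotated[str, tosca.max_length(63)] = "service"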

class tosca.equal(constraint)
Parameters

constraint (tosca._tosca.T) –

class tosca.greater_than(constraint)
Parameters

constraint (tosca._tosca.T) –

class tosca.less_than(constraint)
Parameters

constraint (tosca._tosca.T) –

class tosca.greater_or_equal(constraint)
Parameters

constraint (tosca._tosca.T) –

class tosca.less_or_equal(constraint)
Parameters

constraint (tosca._tosca.T) –

class tosca.in_range(min, max)
Parameters
  • min (tosca._tosca.T) –

  • max (tosca._tosca.T) –

class tosca.valid_values(constraint)
Parameters

constraint (tosca._tosca.T) –

class tosca.length(constraint)
Parameters

constraint (tosca._tosca.T) –

class tosca.min_length(constraint)
Parameters

constraint (tosca._tosca.T) –

class tosca.max_length(constraint)
Parameters

constraint (tosca._tosca.T) –

class tosca.pattern(constraint)
Parameters

constraint (tosca._tosca.T) –

class tosca.schema(constraint)
Parameters

constraint (tosca._tosca.T) –

Other Module Items

tosca.Eval(value)

Use this function to specify that a value is or contains a TOSCA function or eval expression, for example as a property default value.

Parameters

value (Any) –

Return type

Any
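
For example, a minimal sketch using the standard TOSCA get_input function as a property default (the class and input names are illustrative):

class WebApp(tosca.nodes.Root):
    # default value is a TOSCA get_input function evaluated by the orchestrator
    domain: str = tosca.Eval({"get_input": "domain"})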

class tosca.EvalData(expr, path=None)

An internal wrapper around JSON/YAML data that may contain TOSCA functions or Eval Expressions and will be evaluated at runtime.

Parameters
  • expr (Union[EvalData, str, None, Dict[str, Any], List[Any], Callable]) –

  • path (Optional[List[Union[str, tosca._tosca._GetName]]]) –

class tosca.FieldProjection(field, parent=None, obj=None)

An EvalData subclass that references a TOSCA field or a projection off a tosca field (i.e. a field reference from a field reference). These are generally invisible to user code but during “parse” mode, ToscaType attribute references return these “under the hood” but we use type punning so that a Python static type checker won’t know this, allowing user code to operate on them as if they were the concrete values declared on the class. Since field projections are used to generate the YAML that a TOSCA orchestrator consumes, a Python type checker can catch validation errors before a blueprint is deployed.

You can explicitly get a FieldProjection using get_ref

Parameters
  • field (tosca._tosca._Tosca_Field) –

  • parent (Optional[FieldProjection]) –

class tosca.Interface

Base class for defining custom TOSCA Interface Types. Each public method (i.e. one without a leading “_”) corresponds to a TOSCA operation (see tosca.operation). TOSCA types such as Nodes use multiple inheritance to implement interface operations.

_interface_requirements: ClassVar[Optional[List[str]]] = None

Set this in the class definition to include it as the requirements key on the TOSCA YAML interface declaration (an Unfurl extension for specifying the Connections that its operations need to function).

_type_metadata: ClassVar[Optional[Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]]] = None

Set this in the class definition to set metadata for the TOSCA interface type.

_type_name: ClassVar[str] = ''

Set this in the class definition if the TOSCA type name is different from the class name.

class tosca.Repository(url, *, name=None, revision=None, credential=None, description=None, metadata=None)

Create a TOSCA Repository object. Repositories can represent, for example, a git repository, a container image or artifact registry or a file path to a local directory.

Repositories that contain the Python modules imported into a service template are typically declared in the YAML so an orchestrator like Unfurl can map them to tosca_repositories imports. However, if you have a Python service template with code that references a repository (e.g. in an abspath eval expression) and that service template is imported directly by other projects, you should declare the repository in the same Python file.

Parameters
  • url (str) – The url of the repository.

  • name (Optional[str]) – The name of the repository. If not set or None, the Python identifier assigned to the object will be used.

  • revision (Optional[str]) – The revision of the repository to use (e.g. a semantic version or git tag or branch). Defaults to None.

  • description (Optional[str]) – A human-readable description of the repository. Defaults to None.

  • credential (Optional[tosca.datatypes.Credential]) – The credential to use when accessing the repository. Defaults to None.

  • metadata (Optional[Dict[str, Any]]) – Additional metadata about the repository. Defaults to None.

Example:

std = tosca.Repository("https://unfurl.cloud/onecommons/std",
            credential=tosca.datatypes.Credential(user="a_user", token=expr.get_env("MY_TOKEN")))
class tosca.ToscaInputs

Base class for defining TOSCA operation inputs.

Return type

None

class tosca.ToscaOutputs

Base class for defining TOSCA operation outputs.

Return type

None

class tosca.TopologyInputs(_name='', *, _metadata=<factory>)

Base class for defining topology template inputs. When converting to YAML, TopologyInputs subclass definitions will be rendered as the inputs section of the topology template. TopologyInputs subclass objects will be rendered as the input_values section of the service template.

Parameters
  • _name (str) –

  • _metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –

Return type

None
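
For example, a minimal sketch (the input names are illustrative):

class Inputs(tosca.TopologyInputs):
    domain: str
    replicas: int = 1

# instantiating the class supplies the service template's input_values
Inputs(domain="example.com")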

class tosca.TopologyOutputs(_name='', *, _metadata=<factory>)

Base class for defining topology template outputs. When converting to YAML, TopologyOutputs subclass definitions will be rendered as the outputs section of the topology template.

Parameters
  • _name (str) –

  • _metadata (Dict[str, Union[None, int, float, str, bool, Sequence[Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]], Dict[str, Union[None, int, float, str, bool, Sequence[JsonType], Dict[str, JsonType]]]]]) –

Return type

None

class tosca.StandardOperationsKeywords

The names of tosca’s and unfurl’s built-in operations. set_operations uses this to provide type hints for its keyword arguments; ignore it when setting custom operations.

tosca.set_operations(obj: None = None, **kw: Unpack[StandardOperationsKeywords]) nodes.Root
tosca.set_operations(obj: _PT, **kwargs: Unpack[StandardOperationsKeywords]) _PT

Helper method to set operations on a TOSCA template using its keyword arguments as the operation names.

Returns

The given or created template with the specified operations set.

Return type

ToscaType
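
For example, a hedged sketch reusing the hypothetical MyNode and ShellExecutable names from the operation() example above; the assumption is that the keyword name matches the Standard interface's configure operation:

my_node = tosca.set_operations(
    MyNode(),
    configure=operation(ShellExecutable("configure", command="./script.sh")),
)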

class tosca.NodeTemplateDirective(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)

Bases: str, enum.Enum

Enum of Node Template directives.

check = 'check'

Run check operation before deploying.

conditional = 'conditional'

Silently exclude from plan if one of its requirements is not met.

default = 'default'

Ignore this template if one with the same name is defined in the root topology.

dependent = 'dependent'

Exclude from plan if not referenced by other templates.

discover = 'discover'

Discover this template (instead of create).

protected = 'protected'

Block instance from deletion.

select = 'select'

Match the node with an instance in an external ensemble.

substitute = 'substitute'

Replace this node with a nested topology using Substitution Mappings.

virtual = 'virtual'

Don’t instantiate.

tosca.set_evaluation_mode(mode)

A context manager that sets the global (per-thread) tosca evaluation mode and restores the previous mode upon exit. This is only needed for testing or other special contexts.

Parameters

mode (str) – “parse” or “runtime”

Yields

the previous mode

with set_evaluation_mode("parse"):
    assert tosca.global_state_mode() == "parse"
tosca.safe_mode()

This function returns True if running within the Python safe mode sandbox.

Return type

bool

tosca.global_state_mode()

This function returns the execution state (either “parse” or “runtime”) that the current thread is in.

Returns “parse” or “runtime”

Return type

str

tosca.global_state_context()

This function returns orchestrator-specific runtime state for the current thread (or None).

Return type

Any

Scalars

This module exposes TOSCA scalar units as Python _Unit objects.

Each unit name listed in SCALAR_UNIT_DICT can be imported from this module, e.g.:

>>> from tosca import mb
>>> one_mb = 1 * mb
>>> one_mb
1.0 MB
>>> one_mb.value
1000000.0
>>> one_mb.as_unit
1.0
>>> one_mb.to_yaml()
'1.0 MB'
class tosca.scalars.Bitrate(value, unit)
Parameters

unit (_Unit) –

SCALAR_UNIT_DICT: Dict[str, Any] = {'Gbps': 1000000000, 'Gibps': 1073741824, 'Kbps': 1000, 'Kibps': 1024, 'Mbps': 1000000, 'Mibps': 1048576, 'Tbps': 1000000000000, 'Tibps': 1099511627776, 'bps': 1}
tosca_name = 'scalar-unit.bitrate'
unit
value
class tosca.scalars.Frequency(value, unit)
Parameters

unit (_Unit) –

SCALAR_UNIT_DICT: Dict[str, Any] = {'GHz': 1000000000, 'Hz': 1, 'MHz': 1000000, 'kHz': 1000}
tosca_name = 'scalar-unit.frequency'
unit
value
class tosca.scalars.Size(value, unit)
Parameters

unit (_Unit) –

SCALAR_UNIT_DICT: Dict[str, Any] = {'B': 1, 'GB': 1000000000, 'GiB': 1073741824, 'KiB': 1024, 'MB': 1000000, 'MiB': 1048576, 'TB': 1000000000000, 'TiB': 1099511627776, 'kB': 1000}
tosca_name = 'scalar-unit.size'
unit
value
class tosca.scalars.Time(value, unit)
Parameters

unit (_Unit) –

SCALAR_UNIT_DICT: Dict[str, Any] = {'d': 86400, 'h': 3600, 'm': 60, 'ms': 0.001, 'ns': 1e-09, 's': 1, 'us': 1e-06}
tosca_name = 'scalar-unit.time'
unit
value
class tosca.scalars._Scalar(value, unit)
Parameters

unit (_Unit) –

SCALAR_UNIT_DICT: Dict[str, Any] = {}
as_ref(options=None)
property as_unit: float
to_yaml(dict_cls=<class 'dict'>)

Return this value and this type’s TOSCA unit suffix, e.g. "10 kB".

Return type

str

unit
value
class tosca.scalars._Unit(scalar_type, unit)
Parameters
  • scalar_type (Type[tosca.scalars._S]) –

  • unit (str) –

_round(s, ndigits)
Parameters
  • s (tosca.scalars._S) –

  • ndigits (Union[int, None, Literal['ceil'], typing.Literal['floor'], typing.Literal['round']]) –

Return type

Union[int, float]

as_float(s, ndigits=16)

Return the given scalar as a float denominated by this unit. If ndigits is given, round to that many digits.

Parameters
  • s (tosca.scalars._S) –

  • ndigits (int) –

Return type

float

as_int(s, round='round')

Return the given scalar as an integer denominated by this unit, rounding up or down to the nearest integer.

Parameters
  • s (tosca.scalars._S) –

  • round (Union[Literal['round'], typing.Literal['ceil'], typing.Literal['floor']]) –

Return type

int

tosca.scalars._unit(unit)

Return a unit object for the given unit name. Raise NameError if the unit is not found.

Parameters

unit (str) –

Return type

tosca.scalars._Unit

tosca.scalars.scalar(val)

Parse the string into a scalar or return None if parsing fails.

Supported syntax:

(+|-)?[digits]+ *? [unit] (the space between the number and the unit is optional)

Parameters

val (Any) – value to attempt to parse (The value is first converted to string, so _Scalars can be used).

Return type

Optional[tosca.scalars._Scalar]
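
For example, a hedged sketch (following the unit arithmetic shown in the Scalars introduction above):

>>> from tosca.scalars import scalar
>>> s = scalar("2 GiB")
>>> s.value        # value expressed in the base unit (bytes)
2147483648.0
>>> s.to_yaml()
'2.0 GiB'
>>> scalar("not a scalar") is None
True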

tosca.scalars.scalar_value(val_, unit=None, round=None)

Convert a scalar to a number denominated by the given unit. If val_ is a string or a _Scalar, parse it as a scalar. If val_ is a number, assume it is already in the given unit. If the unit keyword is omitted, use the unit parsed from the scalar.

Return a float or an int depending on the round parameter: if an integer, the number of digits to round to; "ceil" or "floor" to round up or down to the nearest integer. If round is omitted or None, round to the nearest integer unless the value's absolute value is less than 1.

Parameters

val_ (Any) –

Return type

Optional[Union[float, int]]

tosca.scalars.unit(unit)

Return a unit object for the given unit name. Raise NameError if the unit is not found.

Parameters

unit (str) –

Return type

tosca.scalars._Unit

Utility Functions

This module contains utility functions that can be executed in "parse" mode (e.g. as part of a class definition or in _class_init) and in the safe mode Python sandbox. Each of these is also available as an Eval Expression Function.

unfurl.tosca_plugins.functions.to_dns_label(arg: Union[str, list], *, allowed="'[a-zA-Z0-9-]'", start="'[a-zA-Z]'", replace="'--'", case="'lower'", end="'[a-zA-Z0-9]'", max='63', **kw: Unpack[DNSLabelKwArgs]) str
unfurl.tosca_plugins.functions.to_dns_label(arg: Mapping, *, allowed="'[a-zA-Z0-9-]'", start="'[a-zA-Z]'", replace="'--'", case="'lower'", end="'[a-zA-Z0-9]'", max='63', **kw: Unpack[DNSLabelKwArgs]) dict

Convert the given argument (see unfurl.tosca_plugins.functions.to_label() for a full description) to a DNS label (a label is one of the names separated by “.” in a domain name). The maximum length of each label is 63 characters; labels can include alphanumeric characters and hyphens but must not begin or end with a hyphen.

Invalid characters are replaced with "--".

unfurl.tosca_plugins.functions.to_googlecloud_label(arg: Union[str, list], *, allowed="'\\\\w-'", case="'lower'", replace="'__'", start="'[a-zA-Z]'", max='63', **kw: Unpack[DNSLabelKwArgs]) str
unfurl.tosca_plugins.functions.to_googlecloud_label(arg: Mapping, *, allowed="'\\\\w-'", case="'lower'", replace="'__'", start="'[a-zA-Z]'", max='63', **kw: Unpack[DNSLabelKwArgs]) dict

See https://cloud.google.com/resource-manager/docs/labels-overview

Invalid characters are replaced with “__”.

unfurl.tosca_plugins.functions.to_kubernetes_label(arg: Union[str, list], *, allowed="'\\\\w.-'", case="'any'", replace="'__'", start="'[a-zA-Z0-9]'", end="'[a-zA-Z0-9]'", max='63', **kw: Unpack[DNSLabelKwArgs]) str
unfurl.tosca_plugins.functions.to_kubernetes_label(arg: Mapping, *, allowed="'\\\\w.-'", case="'any'", replace="'__'", start="'[a-zA-Z0-9]'", end="'[a-zA-Z0-9]'", max='63', **kw: Unpack[DNSLabelKwArgs]) dict

See https://kubernetes.io/docs/concepts/overview/working-with-objects/labels/#syntax-and-character-set

Invalid characters are replaced with “__”.

unfurl.tosca_plugins.functions.to_label(arg: Union[str, list], **kw: Unpack[LabelKwArgs]) str
unfurl.tosca_plugins.functions.to_label(arg: Mapping, **kw: Unpack[LabelKwArgs]) dict
Convert a string to a label with the given constraints.

If arg is a dictionary, all keys and string values are converted. If it is a list, to_label is applied to each item and the results are concatenated using sep (see the sketch after the parameter list below).

Parameters
  • arg (str or dict or list) – Convert to label

  • allowed (str, optional) – Allowed characters. Regex character ranges and character classes. Defaults to “\w” (equivalent to [a-zA-Z0-9_]).

  • replace (str, optional) – String to replace invalid characters with. Defaults to “” (invalid characters are removed).

  • start (str, optional) – Allowed characters for the first character. Regex character ranges and character classes. Defaults to “a-zA-Z”

  • start_prepend (str, optional) – If the start character is invalid, prepend with this string (Default: “x”)

  • end (str, optional) – Allowed trailing characters. Regex character ranges and character classes. If set, invalid characters are stripped.

  • max (int, optional) – max length of label. Defaults to 63 (the maximum for a DNS name).

  • case (str, optional) – “upper”, “lower” or “any” (no conversion). Defaults to “any”.

  • sep (str, optional) – Separator to use when concatenating a list. Defaults to “”

  • digestlen (int, optional) – If a label is truncated, the length of the digest to include in the label. 0 to disable. Default: 3 or 2 if max < 32
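
A hedged sketch of calling to_label with some of the keyword arguments above (the values are illustrative and no output is asserted):

from unfurl.tosca_plugins.functions import to_label

# lower-case the value, keep only characters in a-z0-9, and cap the length at 10
name = to_label("My Project", allowed="a-z0-9", case="lower", max=10)

# convert each item in the list and join the results with "-"
combined = to_label(["staging", "My Project"], sep="-", case="lower")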

Eval Expression Functions

Type-safe equivalents to Unfurl’s Eval Expression Functions.

When called in "parse" mode (e.g. as part of a class definition or in _class_init) they will return an eval expression that will be executed later. Note, however, that the declared type signature matches the result of the expression, not the eval expression itself (this type punning enables effective static type checking).

When called in runtime mode (i.e. in a computed property or in an operation implementation) they perform the equivalent functionality directly.

These functions can also be executed in the safe mode Python sandbox, since the sandbox always executes in “parse” mode.

Note that some functions are overloaded with two signatures: one that takes a live ToscaType object as an argument and one that takes None in its place.

The former variant can only be used in runtime mode because live objects are not available outside that mode. In "parse" mode, the None variant must be used; at runtime the eval expression returned by that function will be evaluated using the current context's instance.

User-defined functions can be made available as expression functions via the runtime_func decorator.
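
For example, a minimal sketch (the class and variable names are illustrative): used in a class definition this records an eval expression in the generated YAML, while at runtime it reads the environment variable directly.

from unfurl.tosca_plugins import expr

class MyService(tosca.nodes.Root):
    token: str = expr.get_env("MY_TOKEN", "")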

unfurl.tosca_plugins.expr.abspath(obj: ToscaType, path: str, relativeTo=None, mkdir=False) FilePath
unfurl.tosca_plugins.expr.abspath(obj: None, path: str, relativeTo=None, mkdir=False) str
unfurl.tosca_plugins.expr.and_expr(left, right)
Parameters
  • left (unfurl.tosca_plugins.expr.T) –

  • right (unfurl.tosca_plugins.expr.U) –

Return type

Union[unfurl.tosca_plugins.expr.T, unfurl.tosca_plugins.expr.U]

unfurl.tosca_plugins.expr.as_bool(val)
Return type

bool

unfurl.tosca_plugins.expr.concat(*args, sep='')
Parameters

args (str) –

Return type

str

unfurl.tosca_plugins.expr.find_connection(target, rel_type=<class 'tosca.builtin_types.relationships.ConnectsTo'>)
Parameters
  • target (Optional[tosca._tosca.Node]) –

  • rel_type (Type[unfurl.tosca_plugins.expr.RI]) –

Return type

Optional[unfurl.tosca_plugins.expr.RI]

unfurl.tosca_plugins.expr.get_context(obj, kw=None)
Return type

unfurl.eval.RefContext

unfurl.tosca_plugins.expr.get_dir(obj: ToscaType, relativeTo=None, mkdir=False) FilePath
unfurl.tosca_plugins.expr.get_dir(obj: None, relativeTo=None, mkdir=False) str
unfurl.tosca_plugins.expr.get_ensemble_metadata(key: None = None) Dict[str, str]
unfurl.tosca_plugins.expr.get_ensemble_metadata(key: str) str
unfurl.tosca_plugins.expr.get_env(name: str, default: str, *, ctx='None') str
unfurl.tosca_plugins.expr.get_env(name: str, *, ctx='None') Optional[str]
unfurl.tosca_plugins.expr.get_env(*, ctx='None') Dict[str, str]
unfurl.tosca_plugins.expr.get_input(name: str, default: TI) TI
unfurl.tosca_plugins.expr.get_input(name: str) Any
unfurl.tosca_plugins.expr.get_instance(obj: Node) NodeInstance
unfurl.tosca_plugins.expr.get_instance(obj: Relationship) RelationshipInstance
unfurl.tosca_plugins.expr.get_instance(obj: ArtifactEntity) ArtifactInstance
unfurl.tosca_plugins.expr.get_instance(obj: CapabilityEntity) CapabilityInstance
unfurl.tosca_plugins.expr.get_instance(obj: ToscaType) EntityInstance

Returns the instance the given TOSCA template is proxying. If not in "runtime" mode or if the template is not proxying an instance, an exception is raised.

unfurl.tosca_plugins.expr.get_instance_maybe(obj: Node) Optional[NodeInstance]
unfurl.tosca_plugins.expr.get_instance_maybe(obj: Relationship) Optional[RelationshipInstance]
unfurl.tosca_plugins.expr.get_instance_maybe(obj: ArtifactEntity) Optional[ArtifactInstance]
unfurl.tosca_plugins.expr.get_instance_maybe(obj: CapabilityEntity) Optional[CapabilityInstance]
unfurl.tosca_plugins.expr.get_instance_maybe(obj: ToscaType) Optional[EntityInstance]

In "runtime" mode return the instance the given TOSCA template is proxying, otherwise return None.

unfurl.tosca_plugins.expr.get_nodes_of_type(cls)
Parameters

cls (Type[tosca._tosca.ToscaType]) –

Return type

list

unfurl.tosca_plugins.expr.has_env(name)
Parameters

name (str) –

Return type

bool

unfurl.tosca_plugins.expr.if_expr(if_cond, then, otherwise=None)

Returns an eval expression like:

{"eval": {"if": if_cond, "then": then, "else": otherwise}

This cannot be used in runtime mode because all arguments are evaluated before this function is called, defeating eval expressions’ (and Python’s) short-circuit semantics. To avoid unexpected behavior, an error is raised if it is invoked in runtime mode; use a Python if statement or expression instead.

Parameters
  • then (unfurl.tosca_plugins.expr.T) –

  • otherwise (unfurl.tosca_plugins.expr.U) –

Return type

Union[unfurl.tosca_plugins.expr.T, unfurl.tosca_plugins.expr.U]

unfurl.tosca_plugins.expr.lookup(name, *args, **kwargs)
Parameters

name (str) –

unfurl.tosca_plugins.expr.not_(val)
Return type

bool

unfurl.tosca_plugins.expr.or_expr(left, right)
Parameters
  • left (unfurl.tosca_plugins.expr.T) –

  • right (unfurl.tosca_plugins.expr.U) –

Return type

Union[unfurl.tosca_plugins.expr.T, unfurl.tosca_plugins.expr.U]

unfurl.tosca_plugins.expr.read(path, encoding=None)

Equivalent to a file eval expression using contents to read the file.

Parameters
  • path (str) –

  • encoding (Optional[str]) –

Return type

Union[str, bytes]

unfurl.tosca_plugins.expr.runtime_func(_func: None = None) Callable[[F], F]
unfurl.tosca_plugins.expr.runtime_func(_func: F) F

A decorator for making a function invocable as a runtime expression. When the decorated function is invoked in "parse" mode and any of its arguments contain tosca.EvalData, the function returns tosca.EvalData containing a JSON representation of the invocation as an expression function that will be evaluated at runtime. Otherwise, the function executes eagerly as a normal Python function.
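
For example, a hedged sketch (the function and field names are illustrative):

from unfurl.tosca_plugins.expr import runtime_func

@runtime_func
def make_url(host: str, port: int) -> str:
    return f"https://{host}:{port}"

class MyService(tosca.nodes.Root):
    host: str = "localhost"
    url: str = tosca.CONSTRAINED

    @classmethod
    def _class_init(cls) -> None:
        # cls.host is EvalData here, so make_url returns EvalData recording the
        # call as an expression function to be evaluated at runtime
        cls.url = make_url(cls.host, 443)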

unfurl.tosca_plugins.expr.super()

Equivalent to the .super eval expression key. Returns a map of the current instance’s attributes as a view of its nearest inherited type. (Python’s super() won’t work in "runtime" mode when proxying an instance.)

For example:

class Base(Node):
    prop: str = "default"

class Derived(Base):
    def _prop(self):
        return "prepend-" + expr.super()["prop"]

    prop: str = Eval(_prop)  # evaluates to "prepend-default"
Return type

MutableMapping[str, Any]

unfurl.tosca_plugins.expr.tempfile(contents, suffix='', encoding=None)
Parameters

contents (Any) –

unfurl.tosca_plugins.expr.template(obj=None, *, path='', contents='', overrides=None, vars=None)
Parameters
  • obj (Optional[tosca._tosca.ToscaType]) –

  • path (str) –

  • contents (str) –

  • overrides (Optional[Dict[str, str]]) –

  • vars (Optional[Dict[str, Any]]) –

Return type

Any

unfurl.tosca_plugins.expr.to_env(args, update_os_environ=False)
Parameters

args (Dict[str, str]) –

Return type

Dict[str, str]

unfurl.tosca_plugins.expr.token(string, token, index)
Parameters
  • string (str) –

  • token (str) –

  • index (int) –

Return type

str

unfurl.tosca_plugins.expr.uri(obj=None)
Parameters

obj (Optional[tosca._tosca.ToscaType]) –

Return type

Optional[str]

API for writing configurators

Configurators

class unfurl.configurator.Configurator(*args, **kw)

Base class for implementing configurators. Subclasses should at least implement run(), for example:

class MinimalConfigurator(Configurator):
    def run(self, task):
        assert self.can_run(task)
        return Status.ok
Parameters

args (tosca._tosca.ToscaInputs) –

Return type

None

attribute_output_metadata_key: Optional[str] = None
can_dry_run(task)

Returns whether this configurator can handle a dry run for the given task. (If so, it should check TaskView.dry_run during run().)

Parameters

task (TaskView) – The task about to be run.

Returns

bool

Return type

bool

can_run(task)

Return whether or not the configurator can execute the given task, depending on whether it supports the requested action and parameters and on the current state of the target instance.

Parameters

task (TaskView) – The task that is about to be run.

Returns

Should return True or a message describing why the task couldn’t be run.

Return type

(bool or str)

check_digest(task, changeset)

Examine the ChangeRecord generated the previous time this operation was performed on the target instance and return whether or not it should be rerun.

The default implementation recalculates the digest of input parameters that were accessed in the previous run.

Parameters
  • task (TaskView) – The task that might execute this operation.

  • changeset (ChangeRecord) – The ChangeRecord generated the previous time this operation was performed on the target instance.

Returns

True if configuration’s digest has changed, False if it is the same.

Return type

bool

exclude_from_digest: Tuple[str, ...] = ()
classmethod get_dry_run(inputs, template)
Parameters

template (unfurl.spec.EntitySpec) –

Return type

bool

get_generator(task)
Parameters

task (unfurl.configurator.TaskView) –

Return type

Generator

render(task)

This method is called during the planning phase to give the configurator an opportunity to do early validation and error detection and generate any plan information or configuration files that the user may want to review before running the deployment task.

Property access and writes will be tracked and used to establish dynamic dependencies between instances so the plan can be ordered properly. Any updates made to instances may be reverted if they depend on attributes that might change later in the plan, so this method should be idempotent.

Returns

The value returned here will subsequently be available as task.rendered

Parameters

task (unfurl.configurator.TaskView) –

Return type

Any

run(task)

Subclasses of Configurator need to implement this method. It should perform the operation specified in the ConfigurationSpec on task.target. It can be either a regular function or a generator that yields one or more JobRequest or TaskRequest objects.

Parameters

task (TaskView) – The task currently running.

Yields

Optionally, run can yield a JobRequest or TaskRequest to run subtasks, and finally a ConfiguratorResult when done

Returns

If run is not defined as a generator it must return either a Status, a bool or a ConfiguratorResult to indicate if the task succeeded and any changes to the target’s state.

Return type

Union[Generator, unfurl.configurator.ConfiguratorResult, unfurl.support.Status, bool]

save_digest(task)

Generate a compact, deterministic representation of the current configuration. This is saved in the job log and used by check_digest in subsequent jobs to determine if the configuration changed and the operation needs to be re-run.

The default implementation calculates a SHA1 digest of the values of the inputs that were accessed while the task ran, with the exception of the input parameters listed in exclude_from_digest.

Parameters

task (TaskView) –

Returns

A dictionary whose keys are strings that start with “digest”

Return type

dict

classmethod set_config_spec_args(kw, template)
Parameters
  • kw (dict) –

  • template (unfurl.spec.EntitySpec) –

Return type

dict

short_name: Optional[str] = None

short_name can be used to customize the “short name” of the configurator as an alternative to using the full name (“module.class”) when setting the implementation on an operation. (Title case recommended.)

should_run(task)

Does this configuration need to be run?

Parameters

task (unfurl.configurator.TaskView) –

Return type

Union[bool, unfurl.support.Priority]

class unfurl.configurator.ConfiguratorResult(success, modified, status=None, result=None, outputs=None, exception=None)

Represents the result of a task that ran.

See TaskView.done() for more documentation.

Return type

None

class unfurl.configurator.JobRequest(resources, errors=None, update=None)

Yield this to run a child job.

get_instance_specs()
property name
property root
set_error(msg)
Parameters

msg (str) –

property target
class unfurl.configurator.TaskRequest(configSpec, target, reason, persist=False, required=None, startState=None)

Yield this to run a child task. (see unfurl.configurator.TaskView.create_sub_task())

Parameters
  • configSpec (unfurl.planrequests.ConfigurationSpec) –

  • target (unfurl.runtime.EntityInstance) –

  • reason (str) –

  • persist (bool) –

  • required (Optional[bool]) –

  • startState (Optional[unfurl.support.NodeState]) –

property completed: bool


finish_workflow()
Return type

None

get_operation_artifacts()
Return type

List[unfurl.planrequests.JobRequest]

property name
reassign_final_for_workflow()
Return type

Optional[unfurl.planrequests.TaskRequest]

class unfurl.configurator.TaskView(manifest, configSpec, target, reason=None, dependencies=None)

The interface presented to configurators.

The following public attributes are available:

Parameters
  • manifest (Manifest) –

  • configSpec (unfurl.planrequests.ConfigurationSpec) –

  • target (unfurl.runtime.EntityInstance) –

  • reason (Optional[str]) –

  • dependencies (Optional[List[Operational]]) –

Return type

None

target

The instance this task is operating on.

configSpec
Type

ConfigurationSpec

reason

The reason this operation was planned. See Reason

Type

str

cwd

Current working directory

Type

str

dry_run

Dry run only

Type

bool

verbose

Verbosity level set for this job (-1 error, 0 normal, 1 verbose, 2 debug)

Type

int

add_dependency(expr, expected=None, schema=None, name=None, required=True, wantList=False, target=None, write_only=None)
Parameters
  • expr (Union[str, collections.abc.Mapping]) –

  • expected (Optional[Union[list, unfurl.result.ResultsList, unfurl.result.Result]]) –

  • schema (Optional[collections.abc.Mapping]) –

  • name (Optional[str]) –

  • required (bool) –

  • wantList (bool) –

  • target (Optional[unfurl.runtime.EntityInstance]) –

  • write_only (Optional[bool]) –

Return type

unfurl.configurator.Dependency

add_message(message)
Parameters

message (object) –

Return type

None

apply_work_folders(*names)
Parameters

names (str) –

Return type

None

property connections: unfurl.configurator._ConnectionsMap
create_sub_task(operation=None, resource=None, inputs=None, persist=False, required=None)

Create a subtask that will be executed if yielded by unfurl.configurator.Configurator.run()

Parameters
  • operation (str) – The operation call (like interface.operation)

  • resource (NodeInstance) –

  • inputs (Optional[dict]) –

  • persist (bool) –

  • required (Optional[bool]) –

Returns

TaskRequest

Return type

Optional[unfurl.planrequests.TaskRequest]
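
For example, a hedged sketch of yielding a subtask from run() (the configurator class and operation name are illustrative):

class MyConfigurator(Configurator):
    def run(self, task):
        # run another operation on the task's target as a subtask
        subtask = task.create_sub_task("Standard.configure")
        if subtask:
            yield subtask
        yield task.done(True)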

discard_work_folders()
Return type

None

done(success=None, modified=None, status=None, result=None, outputs=None, captureException=None)

unfurl.configurator.Configurator.run() should call this method and return or yield its return value before terminating.

>>> yield task.done(True)
Parameters
  • success (bool) – indicates if this operation completed without an error.

  • modified (bool) – (optional) indicates whether the physical instance was modified by this operation.

  • status (Status) – (optional) Should be set if the operation changed the operational status of the target instance. If not specified, the runtime will update the instance status as needed, based on the operation performed and observed changes to the instance (attributes changed).

  • result (dict) – (optional) A dictionary that will be serialized as YAML into the changelog; it can contain any useful data about this operation.

  • outputs (dict) – (optional) Operation outputs, as specified in the topology template.

  • captureException (Optional[object]) –

Returns

ConfiguratorResult

Return type

unfurl.configurator.ConfiguratorResult
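
A fuller sketch of a run() implementation that reports success along with operation outputs (the output name and value are illustrative):

from unfurl.configurator import Configurator

class ExampleConfigurator(Configurator):
    def run(self, task):
        # ... perform the operation on task.target here ...
        yield task.done(
            success=True,
            modified=True,  # the target instance was changed
            outputs={"endpoint": "https://example.com"},  # illustrative output
        )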

property environ: Dict[str, str]
fail_work_folders()
Return type

None

static find_connection(ctx, target, relation='tosca.relationships.ConnectsTo')

Find a relationship that this task can use to connect to the given instance. First look for a relationship between the task’s target instance and the given instance. If none is found, check whether there is a default connection of the given type.

Parameters
  • target (NodeInstance) – The instance to connect to.

  • relation (str, optional) – The relationship type. Defaults to tosca.relationships.ConnectsTo.

  • ctx (unfurl.eval.RefContext) –

Returns

The connection instance.

Return type

RelationshipInstance or None
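
For example, a helper that locates a usable connection to another instance might look like this (a sketch; ctx is assumed to be the task's RefContext and db_instance a NodeInstance):

from unfurl.configurator import TaskView

def find_db_connection(ctx, db_instance):
    # returns a RelationshipInstance that can be used to connect to db_instance,
    # or raises if neither a direct relationship nor a default connection exists
    connection = TaskView.find_connection(ctx, db_instance)
    if connection is None:
        raise ValueError(f"no connection to {db_instance.name}")
    return connection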

find_instance(name)
Parameters

name (str) –

Return type

Optional[unfurl.runtime.NodeInstance]

get_environment(addOnly, env=None)

Return a dictionary of environment variables applicable to this task.

Parameters
  • addOnly (bool) – If addOnly is False, all variables in the current OS environment will be included; otherwise only the added variables will be included.

  • env (Optional[dict]) –

Returns

dict:

Return type

dict

Variable sources (by order of preference, lowest to highest):

  1. The ensemble’s environment

  2. Variables set by the connections that are available to this operation.

  3. Variables declared in the operation’s environment section.

get_settings()
Return type

dict

get_work_folder(location=None)
Parameters

location (Optional[str]) –

Return type

unfurl.projectpaths.WorkFolder

property inputs: unfurl.result.ResultsMap

Exposes inputs and task settings as expression variables, so they can be accessed like:

eval: $inputs::param

or in jinja2 templates:

{{ inputs.param }}

query(query, dependency=False, name=None, required=False, wantList=False, strict=True, vars=None, throw=False, trace=None)

Executes a query using this task’s current context and returns the result.

Parameters
  • query (Union[str, dict]) – The query to be executed. Can be a string or a dictionary.

  • dependency (bool, optional) – If True, saves the query result as a dependency. Defaults to False.

  • name (Optional[str], optional) – The name of the dependency. Defaults to None.

  • required (bool, optional) – If True, marks the dependency as required. Defaults to False.

  • wantList (bool, optional) – If True, expects the result to be a list. Defaults to False.

  • strict (bool, optional) – If True, enforces strict resolution rules. Defaults to True.

  • vars (Optional[dict], optional) – Variables to be added to the query context. Defaults to None.

  • throw (bool, optional) – If True, raises an exception on error; otherwise adds the error to the task’s errors and returns None. Defaults to False.

  • trace (Optional[int], optional) – Trace level for debugging. Defaults to None.

Returns

The result of the query, which can be of various types depending on the query and options provided. Returns None if an error occurs and throw is False.

Return type

Union[Any, Result, List[Result], None]

Raises

Exception – If an error occurs during query evaluation and throw is True.
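
For example, inside Configurator.run() a task can evaluate Eval Expressions in its own context (a sketch; the instance name "my_db" and the attribute name are illustrative):

from unfurl.configurator import Configurator

class QueryExample(Configurator):
    def run(self, task):
        # read one of the task's inputs
        param = task.query("$inputs::param")
        # look up an attribute on another instance and record it as a
        # dependency of this configuration
        address = task.query("::my_db::public_address", dependency=True, name="db_address")
        yield task.done(True, result={"param": param, "address": address})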

remove_dependency(name)
Parameters

name (str) –

Return type

Optional[unfurl.configurator.Dependency]

restore_envvars()

Restore the os.environ to the environment’s state before this task ran.

sensitive(value)

Mark the given value as sensitive. Sensitive values will be encrypted or redacted when output.

Returns

A copy of the value converted to the appropriate subtype of unfurl.logs.sensitive, or the value itself if it can’t be converted.

Return type

sensitive

Parameters

value (object) –

set_envvars()

Update os.environ with the task’s environment variables and save the current one so it can be restored by restore_envvars.

set_work_folder(location='operation', preserve=None, always_apply=False)
Parameters
  • location (str) –

  • preserve (Optional[bool]) –

  • always_apply (bool) –

Return type

unfurl.projectpaths.WorkFolder

update_instances(instances)

Notify Unfurl of new or changed instances made while the task is running.

This will queue a new child job if needed. To immediately run the child job based on the supplied spec, yield the returned JobRequest.

Parameters

instances (Union[str, List[Dict[str, Any]]]) – Either a list or a string that is parsed as YAML.

Return type

Tuple[Optional[unfurl.planrequests.JobRequest], List[unfurl.util.UnfurlTaskError]]

For example, this snippet creates a new instance and modifies the current target instance:

# create a new instance:
- name:     name-of-new-instance
  parent:   HOST # or SELF or <instance name>
  # all other fields should match the YAML in an ensemble's status section
  template: aNodeTemplate
  attributes:
     anAttribute: aValue
  readyState:
    local: ok
    state: started
# modify an existing instance:
- name: SELF
  # the following fields are supported (all are optional):
  template: aNodeTemplate
  attributes:
      anAttribute: aNewValue
      ...
  artifacts:
      artifact1:
        ...
  readyState:
    local: ok
    state: started
  protected: true
  customized: true
property vars: dict

A dictionary of the same variables that are available to expressions when evaluating inputs.

Internal classes supporting the runtime.

class unfurl.support.NodeState(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)

Bases: int, enum.Enum

An enumeration representing TOSCA Node States.

configured = 5
configuring = 4
created = 3
creating = 2
deleted = 11
deleting = 10
error = 12
initial = 1
started = 7
starting = 6
stopped = 9
stopping = 8
class unfurl.support.Priority(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)

Bases: int, enum.Enum

critical = 3
ignore = 0
optional = 1
required = 2
class unfurl.support.Reason(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)

Bases: str, enum.Enum

add = 'add'
check = 'check'
connect = 'connect'
degraded = 'degraded'
error = 'error'

Synonym for repair. Deprecated.

force = 'force'
missing = 'missing'
prune = 'prune'
reconfigure = 'reconfigure'
repair = 'repair'
run = 'run'
stop = 'stop'
subtask = 'subtask'
undeploy = 'undeploy'
update = 'update'
upgrade = 'upgrade'
class unfurl.support.Status(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)

Bases: int, enum.Enum

absent = 5

Instance confirmed to not exist.

degraded = 2

Instance is operational but in a degraded state.

error = 3

Instance is not operational.

ok = 1

Instance is operational

pending = 4

Instance is being brought up or hasn’t been created yet.

unknown = 0

The operational state of the instance is unknown.

class unfurl.result.ChangeRecord(jobId=None, startTime=None, taskId=0, previousId=None, parse=None)

A ChangeRecord represents a job or task in the change log file. It consists of a change ID and named attributes.

A change ID is an identifier with this sequence of 12 characters:

  • “A” serves as a format version identifier

  • 7 alphanumeric characters (0-9, A-Z, and a-z) encoding the date and time the job ran

  • 4 hexadecimal digits encoding the task id

Parameters
  • jobId (Optional[str]) –

  • startTime (Optional[datetime.datetime]) –

  • taskId (int) –

  • previousId (Optional[str]) –

  • parse (Optional[str]) –

classmethod format_log(changeId, attributes)

format: changeid\tkey=value\tkey=value (tab separated)

Parameters
  • changeId (str) –

  • attributes (dict) –

Return type

str

log(attributes=None)

changeid\tkey=value\tkey=value (tab separated)

Parameters

attributes (Optional[dict]) –

Return type

str

Project folders

An ensemble may create the following directories when a job runs:

artifacts

Artifacts required for deployment (e.g. Terraform state files).

secrets

Sensitive artifacts (e.g. certificates). They are vault encrypted in the repository.

local

Artifacts specific to this installation and so excluded from the repository (e.g. a pidfile)

tasks

The most recently generated configuration files for each instance (for informational purposes only – excluded from repository and safe to delete).

Each of these directories will contain subdirectories named after each instance in the ensemble; their contents are populated as those instances are deployed.

When a plan is being generated, a directory named “planned” will be created with the same directory structure as above. When the job executes the plan, those files will be moved to the corresponding directory if the task deployed successfully; otherwise they will be moved to a directory named failed/<changeid>.

When a task runs, its configurator has access to these directories, which it can use to store artifacts in the ensemble’s repository or to generate local configuration files. For this, each deployed instance can have its own set of directories (see _get_base_dir()).

Because generating a plan should not impact what is currently deployed, during the planning and rendering phase a configurator can use the WorkFolder interface to read and write from temporary copies of those folders in the “planned” directory. They will either be moved to “active” or discarded, depending on whether the task succeeds or fails.

This also enables the files generated by the plan to be manually examined – useful for development, error diagnosis and user intervention, or as part of a git-based approval process.

class unfurl.projectpaths.WorkFolder(task, location, preserve)

When a task is running, this class provides access to the directories associated with instances and tasks, such as the directories for reading and writing artifacts and secrets.

Updates to these directories through this class are performed transactionally – accessing a directory through this class marks it for writing and creates a copy of it in the planned directory.

If a task completes successfully, apply() is called, which copies it back to the permanent location of the folder in the ensemble’s “active” directory.

Parameters
  • task (TaskView) –

  • location (str) –

  • preserve (bool) –

always_apply = False
apply()
Return type

str

copy_from(path)
copy_to(path)
property cwd
discard()
Return type

None

failed()
Return type

str

get_current_path(path, mkdir=True)
pending_path(path=None, mkdir=True)

An absolute path to the planning location of this directory.

permanent_path(path, mkdir=True)

An absolute path to the permanent location of this directory.

relpath_to_current(path)
write_file(contents, name, encoding=None)

Create a file with the given contents

Parameters
  • contents – The contents of the file to be written.

  • name (string) – Relative path to write to.

  • encoding (string) – (Optional) One of “binary”, “vault”, “json”, “yaml” or an encoding registered with the Python codec registry.

Returns

An absolute path to the file.

Return type

str
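
For example, a configurator could render a configuration file into its task's work folder (a sketch; the file name and contents are illustrative):

from unfurl.configurator import Configurator

class WorkFolderExample(Configurator):
    def run(self, task):
        # use the task's "operation" work folder (the default location)
        folder = task.set_work_folder()
        # the file is written to the "planned" copy of the folder and is only
        # moved to the "active" directory if the task succeeds
        path = folder.write_file("key: value\n", "config.yaml")
        yield task.done(True, result={"rendered": path})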

unfurl.projectpaths._get_base_dir(ctx, name=None)

Returns an absolute path based on the given folder name:

.

directory that contains the current instance’s ensemble

Src

directory of the source file this expression appears in

Artifacts

directory for the current instance (committed to repository).

Local

The “local” directory for the current instance (excluded from repository)

Secrets

The “secrets” directory for the current instance (files written there are vault encrypted)

Tmp

A temporary directory for the current instance (removed after unfurl exits)

Tasks

Job specific directory for the current instance (excluded from repository).

Operation

Operation specific directory for the current instance (excluded from repository).

Workflow

Workflow specific directory for the current instance (excluded from repository).

Spec.src

The directory of the source file the current instance’s template appears in.

Spec.home

Directory unique to the current instance’s TOSCA template (committed to the spec repository).

Spec.local

Local directory unique to the current instance’s TOSCA template (excluded from repository).

Project

The root directory of the current project.

Project.secrets

The “secrets” directory for the current project (files written there are vault encrypted)

Unfurl.home

The location of home project (UNFURL_HOME).

Repository.<name>

The location of the repository with the given name

Otherwise look for a repository with the given name and return its path or None if not found.

Runtime module

This module defines the core model and implements the runtime operations of the model.

The state of the system is represented as a collection of Instances. Each instance has a status; attributes that describe its state; and a TOSCA template which describes its capabilities, relationships and available interfaces for configuring and interacting with it.

class unfurl.runtime.Operational

This is an abstract base class for Jobs, Resources, and Configurations. They all have a Status associated with them and all use the same algorithm to compute their status from their dependent resources, tasks, and configurations.

static aggregate_status(statuses, seen)

Returns: ok, degraded, pending or None

If there are no instances, return None. If any required are not operational, return pending or error. If any other are not operational or are degraded, return degraded. Otherwise return ok. (Instances with priority set to “ignore” are ignored.)

Parameters
Return type

Optional[unfurl.support.Status]

get_operational_dependencies()

Return an iterator of Operational objects that this instance directly depends on to be operational.

Return type

Iterable[unfurl.runtime.Operational]

get_operational_dependents()

Return an iterator of Operational objects that directly depend on this instance to be operational.

Return type

Iterable[unfurl.runtime.Operational]

has_changed(changeset)

Whether or not this object changed since the given ChangeRecord.

Parameters

changeset (Optional[unfurl.result.ChangeRecord]) –

Return type

bool

class unfurl.runtime.OperationalInstance(status=None, priority=None, manualOveride=None, lastStateChange=None, lastConfigChange=None, state=None)

A concrete implementation of Operational

Parameters
Return type

None

get_operational_dependencies()

Return an iterator of Operational objects that this instance directly depends on to be operational.

Return type

Iterable[unfurl.runtime.Operational]

property local_status: Optional[unfurl.support.Status]

The local_status property.

property manual_override_status: Optional[unfurl.support.Status]

The manualOverideStatus property.

property priority: Optional[unfurl.support.Priority]

The priority property.

property state: Optional[unfurl.support.NodeState]

The state property.

APIs for controlling Unfurl

Localenv module

Classes for managing the local environment.

Repositories can optionally be organized into projects that have a local configuration.

By convention, the “home” project defines a localhost instance and adds it to its context.

class unfurl.localenv.LocalEnv(manifestPath=None, homePath=None, parent=None, project=None, can_be_empty=False, override_context=None, overrides=None, readonly=False)

This class represents the local environment that an ensemble runs in, including the local project it is part of and the home project.

Parameters
  • manifestPath (Optional[str]) –

  • homePath (Optional[str]) –

  • parent (Optional[unfurl.localenv.LocalEnv]) –

  • project (Optional[unfurl.localenv.Project]) –

  • can_be_empty (bool) –

  • override_context (Optional[str]) –

  • overrides (Optional[Dict[str, Any]]) –

  • readonly (Optional[bool]) –

Return type

None

find_git_repo(repoURL, revision=None)
Parameters
  • repoURL (str) –

  • revision (Optional[str]) –

Return type

Optional[unfurl.repo.GitRepo]

find_or_create_working_dir(repoURL, revision=None, basepath=None, checkout_args={}, locked=False)
Parameters
  • repoURL (str) –

  • revision (Optional[str]) –

  • basepath (Optional[str]) –

  • checkout_args (dict) –

  • locked (bool) –

Return type

Tuple[Optional[unfurl.repo.GitRepo], Optional[str], Optional[bool]]

find_path_in_repos(path, importLoader=None)

If the given path is part of the working directory of a git repository, return that repository and a path relative to it.

Parameters
  • path (str) –

  • importLoader (Optional[Any]) –

Return type

Tuple[Optional[unfurl.repo.GitRepo], Optional[str], Optional[str], Optional[bool]]

find_project(testPath, stopPath=None)

Walk parents looking for unfurl.yaml

Parameters
  • testPath (str) –

  • stopPath (Optional[str]) –

Return type

Optional[unfurl.localenv.Project]

get_context(context=None)

Return a new context that merges the given context with the local context.

Parameters

context (Optional[dict]) –

Return type

Dict[str, Any]

get_external_manifest(location, skip_validation, safe_mode)
Parameters
  • location (dict) –

  • skip_validation (bool) –

  • safe_mode (bool) –

Return type

Optional[YamlManifest]

get_local_instance(name, context)
Parameters
  • name (str) –

  • context (dict) –

Return type

Tuple[unfurl.runtime.NodeInstance, dict]

get_manifest(path=None, skip_validation=False, safe_mode=None)
Parameters
  • path (Optional[str]) –

  • skip_validation (bool) –

  • safe_mode (Optional[bool]) –

Return type

YamlManifest

get_paths()

Return a list of directories for $PATH. Includes the directory the unfurl script is installed in and, if asdf is installed, appends a PATH list derived from the .tool-versions files found in the current project and the home project.

Return type

List[str]

get_project(path, homeProject)
Parameters
Return type

unfurl.localenv.Project

static get_runtime(ensemble_path, home_path)
Parameters
  • ensemble_path (Optional[str]) –

  • home_path (Optional[str]) –

Return type

Optional[str]

get_vault()
get_vault_password(vaultId='default')
Parameters

vaultId (str) –

Return type

Optional[str]

Parameters
  • base_path (str) –

  • name (str) –

  • url (str) –

Return type

Tuple[str, str]

map_value(val, env_rules)

Evaluate using project home as a base dir.

Parameters
  • val (Any) –

  • env_rules (Optional[dict]) –

Return type

Any

parent: Optional[unfurl.localenv.LocalEnv] = None
project: Optional[unfurl.localenv.Project] = None
class unfurl.localenv.Project(path, homeProject=None, overrides=None, readonly=False, register=False)

An Unfurl project is a folder that contains at least a local configuration file (unfurl.yaml) and one or more ensemble.yaml files, which may optionally be organized into one or more git repositories.

Parameters
  • path (str) –

  • homeProject (Optional[Project]) –

  • overrides (Optional[dict]) –

  • readonly (Optional[bool]) –

  • register (bool) –

add_context(name, value)
Parameters
  • name (str) –

  • value (dict) –

adjust_manifest_path(location, local_env)
Parameters
Return type

str

create_working_dir(gitUrl, ref=None)
Parameters
  • gitUrl (str) –

  • ref (Optional[str]) –

Return type

unfurl.repo.GitRepo

find_ensemble_by_name(name)
Parameters

name (str) –

Return type

Optional[dict]

find_ensemble_by_path(path)
Parameters

path (str) –

Return type

Optional[dict]

find_git_repo(repoURL, revision=None)
Parameters
  • repoURL (str) –

  • revision (Optional[str]) –

Return type

Optional[unfurl.repo.GitRepo]

find_git_repo_from_repository(repoSpec)
Parameters

repoSpec (toscaparser.repositories.Repository) –

Return type

Optional[unfurl.repo.GitRepo]

find_or_clone(repo)
Parameters

repo (unfurl.repo.GitRepo) –

Return type

unfurl.repo.GitRepo

find_or_create_working_dir(repoURL, revision=None)
Parameters
  • repoURL (str) –

  • revision (Optional[str]) –

Return type

unfurl.repo.GitRepo

static find_path(testPath, stopPath=None)

Walk parents looking for unfurl.yaml

Parameters
  • testPath (str) –

  • stopPath (Optional[str]) –

Return type

Optional[str]

find_path_in_repos(path, importLoader=None)

If the given path is part of the working directory of a git repository, return that repository and a path relative to it.

Parameters
  • path (str) –

  • importLoader (Optional[Any]) –

Return type

Tuple[Optional[unfurl.repo.RepoView], Optional[str], Optional[bool]]

static get_asdf_paths(projectRoot, asdfDataDir, toolVersions={})
Return type

List[str]

get_context(contextName, context=None)
Parameters
  • contextName (Optional[str]) –

  • context (Optional[dict]) –

Return type

dict

get_default_context()
Return type

Optional[str]

get_default_project_path(context_name)

If there is a default project set for the given environment, return its path.

Parameters

context_name (str) –

Return type

Optional[str]

get_managed_project(location, localEnv)
Parameters
Return type

Optional[unfurl.localenv.Project]

static get_name_from_dir(projectRoot)
Parameters

projectRoot (str) –

Return type

str

get_relative_path(path)
Parameters

path (str) –

Return type

str

get_unique_path(name)
Parameters

name (str) –

Return type

str

get_vault_password(contextName=None, vaultId='default')
Parameters
  • contextName (Optional[str]) –

  • vaultId (str) –

Return type

Optional[str]

get_vault_passwords(contextName=None)
Parameters

contextName (Optional[str]) –

Return type

Iterable[Tuple[str, Union[str, bytes]]]

has_ensembles()
Return type

bool

is_path_in_project(path)
Parameters

path (str) –

Return type

bool

load_yaml_include(yamlConfig, templatePath, baseDir, warnWhenNotFound=False, expanded=None, action=None)

This is called while the YAML config is being loaded. Returns (url or fullpath, parsed yaml)

Parameters
  • yamlConfig (unfurl.yamlloader.YamlConfig) –

  • templatePath (Union[str, dict]) –

  • action (Optional[unfurl.yamlloader.LoadIncludeAction]) –

make_vault_lib(contextName=None)
Parameters

contextName (Optional[str]) –

Return type

Optional[ansible.parsing.vault.VaultLib]

property name: str
static normalize_path(path)
Parameters

path (str) –

Return type

str

project_repoview: unfurl.repo.RepoView
register_ensemble(manifestPath, *, project=None, managedBy=None, context=None, local=False, default=False)
Parameters
Return type

bool

register_project(project, save=False)

Register a project with this project. Return True if this project’s config file was updated and saved to disk.

Parameters
Return type

bool

reload()
search_for_default_manifest()
Return type

Optional[str]

should_include_path(path)
Parameters

path (str) –

Return type

bool

property venv: Optional[str]

Job module

A Job is generated by comparing a list of specs with the last known state of the system. A Job runs tasks, each of which has a configuration spec that is executed on the running system. Each task tracks and records its modifications to the system’s state.

class unfurl.job.ConfigChange(parentJob=None, startTime=None, status=None, previousId=None, **kw)

Represents a configuration change made to the system. It has an operational status and a list of dependencies that contribute to its status. There are two kinds of dependencies:

  1. Live resource attributes that the configuration’s inputs depend on.

  2. Other configurations and resources it relies on to function properly.

Parameters
  • parentJob (Optional[Job]) –

  • startTime (Optional[datetime.datetime]) –

  • status (Optional[Union[unfurl.runtime.OperationalInstance, int, str]]) –

  • previousId (Optional[str]) –

  • kw (Any) –

Return type

None

class unfurl.job.Job(manifest, rootResource, jobOptions, previousId=None)

runs ConfigTasks and child Jobs

Parameters
  • manifest (YamlManifest) –

  • rootResource (unfurl.runtime.TopologyInstance) –

  • jobOptions (unfurl.job.JobOptions) –

  • previousId (Optional[str]) –

Return type

None

can_run_task(task, not_ready)

Checked at runtime right before each task is run

  • validate inputs

  • check pre-conditions to see if it can be run

  • check task if it can be run

Parameters
  • task (unfurl.job.ConfigTask) –

  • not_ready (Sequence[unfurl.planrequests.PlanRequest]) –

Return type

Tuple[bool, str]

get_operational_dependencies()

Return an iterator of Operational objects that this instance directly depends on to be operational.

Return type

Iterable[unfurl.job.ConfigTask]

run_task(task, not_ready, depth=0)

During each task run:

  • Notification of metadata changes that reflect changes made to resources

  • Notification of adding or removing a dependency on a resource or properties of a resource

  • Notification of creation or deletion of a resource

  • Requests for a resource with requested metadata; if it doesn’t exist, a task is run to make it so (e.g. add a DNS entry, install a package).

Returns a task.

Parameters
  • task (unfurl.job.ConfigTask) –

  • not_ready (Sequence[unfurl.planrequests.PlanRequest]) –

  • depth (int) –

Return type

unfurl.job.ConfigTask

should_run_task(task)

Checked before rendering the task.

Parameters

task (unfurl.job.ConfigTask) –

Return type

Tuple[bool, str]

class unfurl.job.JobOptions(**kw)

Options available to select which tasks are run, e.g. read-only

Parameters

kw (Any) –

Return type

None

unfurl.job.run_job(manifestPath=None, _opts=None)

Loads the given Ensemble and creates and runs a job.

Parameters
  • manifestPath (str, optional) – If None, it will look for an ensemble in the current working directory.

  • _opts (dict, optional) – the names of the command line options for creating jobs.

Returns

The job that just ran or None if it couldn’t be created.

Return type

(Job)
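
For example, running a job programmatically might look like this (a sketch; the option name "dryrun" is assumed to mirror the corresponding command line option):

from unfurl.job import run_job

# assumes an ensemble can be found in the current working directory
job = run_job(None, {"dryrun": True})
if job is None:
    print("job could not be created")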

Init module

This module implements creating and cloning project and ensembles as well Unfurl runtimes.

unfurl.init.clone(source, dest, ensemble_name='ensemble', **options)

Clone the source project or ensemble to dest. If dest isn’t in a project, create a new one.

source can be a git URL or a path inside a local git repository. Git URLs can specify a particular file in the repository using a URL fragment like #<branch_or_tag>:<file/path>. You can also use a cloudmap URL like cloudmap:<package_id>, which will resolve to a git URL.

source can point to an Unfurl project, an ensemble template, a service template, an existing ensemble, or a folder containing one of those.

The result of the clone depends on the destination (dest):

Inside source project

New or forked ensemble (depending on source)

Missing or empty folder

Clone project, create new ensemble if missing

Another project

See below

Non-empty folder

Error, abort

When creating a new ensemble from a source, if the source points to:

  • an ensemble: fork the ensemble (clone without status and new uri)

  • an ensemble template or TOSCA service template: create a new ensemble from the template.

  • a project: If the project includes an ensemble-template.yaml, use that; if missing, fork the project’s default ensemble.

When dest is set to another project, clone’s behavior depends on source:

If the source is a local file path, the project and its local repository are registered in the destination project and a new ensemble is created based on the source.

If the source is a git URL, the repository is cloned inside the destination project. A new ensemble is only created if the source specified a particular ensemble or template, or if the source was a blueprint project (i.e. it contains an ensemble template but doesn’t contain any ensembles).

When deploying an ensemble that is in a project that was cloned into another project, the environment settings in each unfurl.yaml are merged, with the top-level project’s settings taking precedence.

Parameters
  • source (str) –

  • dest (str) –

  • ensemble_name (str) –

  • options (Any) –

Return type

str
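
For example, cloning a blueprint repository into a new folder (a sketch; the URL, paths, and ensemble name are illustrative):

from unfurl.init import clone

# clone a blueprint project and create an ensemble named "production" in it
message = clone(
    "https://github.com/example/blueprint.git",
    "./my-deployment",
    ensemble_name="production",
)
print(message)  # the returned string describes what was created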

Utility classes and functions

class unfurl.logs.sensitive

Base class for marking a value as sensitive. Depending on the context, sensitive values will either be encrypted or redacted when output.

exception unfurl.util.UnfurlError(message, saveStack=False, log=False)
Parameters
  • message (object) –

  • saveStack (bool) –

  • log (bool) –

Return type

None

exception unfurl.util.UnfurlTaskError(task, message, log=40, dependency=None)
Parameters
  • task (TaskView) –

  • message (object) –

  • log (int) –

unfurl.util.filter_env(rules, env=None, addOnly=False, sub=None)

Applies the given rules to a dictionary of environment variables and returns a new dictionary.

Parameters
  • rules (dict) – A dictionary of rules for adding, removing and filtering environment variables.

  • env (dict, optional) – The environment to apply the given rules to. If env is None it will be set to os.environ. Defaults to None.

  • addOnly (bool, optional) – If addOnly is False (the default), all variables in env will be included in the returned dict; otherwise only variables added by the rules will be included.

  • sub (Optional[MutableMapping]) –

Return type

Dict[str, str]

Rules are applied in the order they are declared in the rules dictionary. The following examples show the different patterns for the rules (a usage sketch follows this list):

foo: bar

Add foo=bar

+foo

Copy foo from the current environment

+foo: bar

Copy foo, or add foo=bar if it is not present

+foo*

Copy all names from the current environment that match foo*

+!foo*

Copy all names from the current environment except those matching foo*

-!foo

Remove all names except for foo

^foo: /bar/bin

Treat foo like PATH and prepend /bar/bin:$foo
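
A minimal usage sketch (the variable names are illustrative and the expected result reflects my reading of the rules above):

from unfurl.util import filter_env

env = {"FOO_TOKEN": "abc", "FOO_URL": "https://example.com", "SECRET": "xyz"}
rules = {
    "+FOO*": None,  # copy every variable matching FOO*
    "BAR": "bar",   # add BAR=bar
}
# with addOnly=True only the variables the rules copy or add are returned,
# so the expected result is {"FOO_TOKEN": "abc", "FOO_URL": ..., "BAR": "bar"}
filtered = filter_env(rules, env, addOnly=True)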

class unfurl.util.sensitive_bytes

Transparent wrapper class to mark bytes as sensitive

decode(*args, **kwargs)

Wrapper method to ensure type conversions maintain sensitive context

Parameters
  • args (List[object]) –

  • kwargs (collections.abc.Mapping) –

Return type

unfurl.util.sensitive_str

class unfurl.util.sensitive_dict

Transparent wrapper class to mark a dict as sensitive

class unfurl.util.sensitive_list(iterable=(), /)

Transparent wrapper class to mark a list as sensitive

class unfurl.util.sensitive_str

Transparent wrapper class to mark a str as sensitive

encode(*args, **kwargs)

Wrapper method to ensure type conversions maintain sensitive context

Parameters
  • args (List[object]) –

  • kwargs (collections.abc.Mapping) –

Return type

unfurl.util.sensitive_bytes

Eval Expression API

API for evaluating Eval Expressions. See also unfurl.configurator.TaskView.query.

class unfurl.eval.Ref(exp, vars=None, trace=None)

A Ref object describes a path to metadata associated with a resource.

Parameters
  • exp (Union[str, collections.abc.Mapping]) –

  • vars (Optional[dict]) –

  • trace (Optional[int]) –

Return type

None

static is_ref(value)

Return true if the given value looks like a Ref.

Parameters

value (Union[collections.abc.Mapping, unfurl.eval.Ref]) –

Return type

bool

resolve(ctx: RefContext, wantList: Literal[True] = True, strict: Optional[bool] = None) List[Any]
resolve(ctx: RefContext, wantList: Literal[False], strict: Optional[bool] = None) ResolveOneUnion
resolve(ctx: RefContext, wantList: Literal['result'], strict: Optional[bool] = None) List[Result]
resolve(ctx: RefContext, wantList: Union[bool, Literal['result']] = True, strict: Optional[bool] = None) Union[List[Result], List[Any], ResolveOneUnion]

Given an expression, returns a value, a list of values, or a list of Result objects, depending on wantList:

If wantList=True (default), return a list of matches.

Note that values in the list can be a list or None.

If wantList=False, return resolve_one semantics.

If wantList=’result’, return a list of Result.

resolve_one(ctx, strict=None)

Given an expression, return a value, None or a list of values.

If there is no match, return None.

If there is more than one match, return a list of matches.

Otherwise return the match.

Note: If you want to distinguish between None values and no match, or between a single match that is a list and a list of matches, use Ref.resolve, which always returns a (possibly empty) list of matches.

Parameters
Return type

Union[None, Any, List[Any]]
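
For example, evaluating an expression against an instance (a sketch; the instance name "my_db" and the attribute in the expression are illustrative):

from unfurl.eval import Ref, RefContext

def lookup_address(instance):
    # instance: a ResourceRef (e.g. a NodeInstance) used as the starting context
    ctx = RefContext(instance)
    # "::my_db::public_address" selects an attribute on the instance named "my_db"
    return Ref("::my_db::public_address").resolve_one(ctx)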

class unfurl.eval.RefContext(currentResource, vars=None, wantList=False, resolveExternal=False, trace=None, strict=None, task=None)

The context of the expression being evaluated.

Parameters
  • currentResource (ResourceRef) –

  • vars (Optional[dict]) –

  • wantList (Optional[Union[bool, Literal['result']]]) –

  • resolveExternal (bool) –

  • trace (Optional[int]) –

  • strict (Optional[bool]) –

  • task (Optional[TaskView]) –

Return type

None

copy(resource=None, vars=None, wantList=None, trace=0, strict=None, tosca_type=None)

Create a copy of the current RefContext with optional modifications.

Parameters
  • resource (Optional[ResourceRef]) – The resource reference to use for the copy. Defaults to None.

  • vars (Optional[dict]) – A dictionary of variables to update in the copy. Defaults to None.

  • wantList (Optional[Union[bool, Literal["result"]]]) – Determines if a list is desired. Defaults to None.

  • trace (int) – The trace level for the copy. Defaults to 0.

  • strict (Optional[bool]) – Whether to enforce strict mode. Defaults to None.

  • tosca_type (Optional[StatefulEntityType]) – The TOSCA type for the copy. Defaults to None.

Returns

A new instance of RefContext with the specified modifications.

Return type

RefContext

unfurl.eval.map_value(value, resourceOrCxt, applyTemplates=True, as_list=False, flatten=False)

Return a copy of the given string, dict, or list, resolving any expressions or template strings embedded in it.

Parameters
  • value (Any) – The value to be processed, which can be a string, dictionary, or list.

  • resourceOrCxt (Union["RefContext", "ResourceRef"]) – The context or resource instance used for resolving expressions.

  • applyTemplates (bool, optional) – Whether to evaluate Jinja2 templates embedded in strings. Defaults to True.

  • as_list (bool, optional) – Whether to return the result as a list. Defaults to False.

  • flatten (bool) –

Returns

The processed value with resolved expressions or template strings. If as_list is True, the result is always a list.

Return type

Any
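
For example, resolving expressions embedded in a settings dictionary (a sketch; the keys and the embedded expression are illustrative):

from unfurl.eval import map_value, RefContext

def resolve_settings(instance):
    # instance: a ResourceRef used as the evaluation context
    settings = {
        "name": "static value",
        "address": {"eval": "::my_db::public_address"},  # an Eval Expression
    }
    return map_value(settings, RefContext(instance))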

Graphql module

This module defines GraphQL representations of ensembles and Unfurl environments, including simplified representations of TOSCA types and node templates as documented by the GraphQL schema below.

These objects are exported as JSON by the export command and by unfurl server API endpoints.

type DeploymentTemplate {
  name: String!
  title: String!
  slug: String!
  description: String
  visibility: String
  metadata: JSON

  blueprint: ApplicationBlueprint!
  primary: ResourceTemplate!
  resourceTemplates: [ResourceTemplate!]
  cloud: ResourceType
  environmentVariableNames: [String!]
  source: String
  branch: String
  commitTime: String
}

type Deployment {
  title: String!
  primary: Resource
  resources: [Resource!]
  deploymentTemplate: DeploymentTemplate!
  url: url
  status: Status
  summary: String
  workflow: String
  deployTime: String
  packages: JSON
}

type ResourceType {
    name: String!
    title: String
    extends: [ResourceType!]
    description: String
    badge: String
    icon: String
    visibility: String
    details_url: String
    inputsSchema: JSON
    computedPropertiesSchema: JSON
    outputsSchema: JSON
    requirements: [RequirementConstraint!]
    implementations: [String]
    implementation_requirements: [String]
    directives: [String]
    metadata: JSON
    _sourceinfo: JSON
}

type RequirementConstraint {
    name: String!
    title: String
    description: String
    resourceType: ResourceType!
    match: ResourceTemplate
    min: Int
    max: Int
    badge: String
    visibility: String
    icon: String
    inputsSchema: Required[JSON]
    requirementsFilter: [RequirementConstraint!]
}

type ResourceTemplate {
    name: String!
    title: String
    type: ResourceType!
    visibility: String
    directives: [String!]
    imported: String
    metadata: JSON

    description: string

    # Maps to an object that conforms to type.inputsSchema
    properties: [Input!]

    dependencies: [Requirement!]
}

type Requirement {
  name: String!
  constraint: RequirementConstraint!
  match: ResourceTemplate
  target: Resource
  visibility: String
}

type Resource {
  name: String!
  title: String!
  url: String
  template: ResourceTemplate!
  status: Status
  state: State
  attributes: [Input!]
  computedProperties: [Input!]
  connections: [Requirement!]
  protected: Boolean
  imported: String
}

type DeploymentEnvironment {
    name: String!
    connections: [ResourceTemplate!]
    instances: [ResourceTemplate!]
    primary_provider: ResourceTemplate
    repositories: JSON!
}

type DeploymentPath {
  name: String!
  environment: String!
  project_id: String
  pipelines: [JSON!]
  incremental_deploy: boolean!
}

type ApplicationBlueprint {
  name: String!
  title: String
  description: String
  primary: ResourceType!
  primaryDeploymentBlueprint: String
  deploymentTemplates: [DeploymentTemplate!]
  projectPath: String
  blueprintPath: String

  livePreview: String
  sourceCodeUrl: String
  image: String
  projectIcon: String
}