Changelog for Unfurl

v0.9.1 - 2023-10-25

Compare with 0.9.0

Features

TOSCA and Python DSL

  • Add ToscaInputs classes for Unfurl’s built-in configurators to enable static type checking of configurator inputs.

  • Introduce the Options class to enable typed metadata for TOSCA fields and have the Terraform configurator add tfvar and tfoutput options to automatically map properties and attributes to Terraform variables and outputs, respectively.

This example uses the above features to integrate a Terraform module.

import tosca
from tosca import Attribute, Property, operation
from unfurl.configurators.terraform import TerraformConfigurator, TerraformInputs, tfvar, tfoutput

class GenericTerraformManagedResource(tosca.nodes.Root):
    example_terraform_var: str = Property(options=tfvar)
    example_terraform_output: str = Attribute(options=tfoutput)

    @operation(apply_to=["Install.check", "Standard.configure", "Standard.delete"])
    def default(self, **kw):
        return TerraformConfigurator(TerraformInputs(main="terraform_dir"))

  • Introduce unfurl.datatypes.EnvironmentVariables, a TOSCA datatype that converts to a map of environment variables. Subclass this type to enable statically typed environment variables.
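
For instance, a minimal sketch in the Python DSL (the import path and field names are assumptions for illustration, not taken from the release notes):

# Assumption: EnvironmentVariables is importable from unfurl.datatypes in the
# Python DSL; adjust the import if your Unfurl version exposes it elsewhere.
from unfurl.datatypes import EnvironmentVariables

class MyAppEnvironment(EnvironmentVariables):
    # each typed field becomes an entry in the generated map of environment variables
    DATABASE_URL: str = ""
    DEBUG: bool = False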

  • Allow TOSCA data types to declare a “transform” in metadata that is applied as a property transform.

  • node_filter improvements:

    • Recursively merge requirements keys in node filters when determining the node_filter for a requirement.

    • Allow get_nodes_of_type TOSCA function in node_filter match expressions.
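
For example, a hedged sketch of a node_filter that uses this function in a match expression (the requirement name and node type are illustrative):

requirements:
  - host:
      node_filter:
        match:
          - eval:
              get_nodes_of_type: Example.Database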

  • Release 0.0.5 version of the Python tosca package.

Packages

  • Allow service templates to declare the unfurl version they are compatible with.

They can do this by declaring a repository for the unfurl package like so:

        repositories:
          unfurl:
            url: https://github.com/onecommons/unfurl
            revision: v0.9.1

Unfurl will still resolve imports in the unfurl package using the locally installed version of unfurl, but it will raise an error if that version isn’t compatible with the version declared here.

  • Do semver compatibility check for 0.* versions.

Even though pre-1.0 versions aren’t expected to provide semver guarantees, the alternative to doing the semver check would be to treat every version as incompatible with every other version, requiring every version reference to a package to be updated with each package update. That isn’t very useful, especially when developing against an unstable package.

  • Support file URLs in package rules.

Minor Enhancements and Notable Bug Fixes

  • parser: allow merge keys to be optional, e.g. “+?/a/d”

  • loader: Add a UNFURL_OVERWRITE_POLICY environment variable to guide the loader’s Python-to-YAML converter.

  • loader: Relax restrictions on from foo import * and other bug fixes with the DSL sandbox’s Python import loader.

  • init: apply UNFURL_SEARCH_ROOT to unfurl project search.

  • packages: If a package rule specifies a full URL, preserve it when applying the rule.

v0.9.0 - 2023-10-13

Compare with 0.8.0

Features

Introduce Python DSL for TOSCA

Write TOSCA as Python modules instead of YAML. Features include:

  • Static type checking of your TOSCA model.

  • IDE integration.

  • The export command now supports conversion from YAML to Python and from Python to YAML.

  • The Python data model simplifies TOSCA YAML while still allowing advanced constraints that encapsulate verbose relationships and node filters.

  • Python executes in a sandbox to safely parse untrusted TOSCA service templates.

See https://github.com/onecommons/unfurl/blob/main/tosca-package/README.md for more information.

Unfurl Cloud local development mode

You can now see local changes to a blueprint project under development on Unfurl Cloud by running unfurl serve . in the project directory. If the project was cloned from Unfurl Cloud, Unfurl Cloud will connect to that local server to render and deploy your local copy of the blueprint (for security, in your local browser only). Use the --cloud-server option to specify an alternative Unfurl Cloud instance.

Embedded Blueprints (TOSCA substitution mapping)

Support for TOSCA substitution mapping has been stabilized and integrated into Unfurl Cloud.

One notable enhancement is an extension to TOSCA’s requirements mapping that lets you essentially parameterize an embedded template: the outer (embedding) template can substitute node templates in the embedded template.

When a substituted node template (in the outer topology) declares a requirement whose name matches the name of a node template in the substituted (inner) topology, that node template is replaced by the node template targeted by the requirement.

For example, if the substituted (inner) topology looked like:

node_types:
  NestedWithPlaceHolder:
    derived_from: tosca:Root
    requirements:
    - placeholder:
        node: PlaceHolderType

topology_template:
  substitution_mappings:
    node_type: NestedWithPlaceHolder

  node_templates:
    placeholder:
      type: PlaceHolderType
    ...

Then another topology that embeds it can replace the “placeholder” template like so:

node_templates:  
    nested1:
      type: NestedWithPlaceHolder
      directives:
      - substitute
      requirements:
        - placeholder: replacement

    replacement:
      type: PlaceHolderType
      ...

CLI Improvements

  • Add --skip-upstream-check global option which skips pulling latest upstream changes from existing repositories and checking remote repositories for version tags.

Improvements to subcommands:

  • init Add Azure and Kubernetes project skeletons (unfurl init --skeleton k8s and unfurl init --skeleton azure)

  • clone Add --design flag to configure the cloned project for blueprint development.

  • serve Add --cloud-server option to specify Unfurl Cloud instance to connect to.

  • export Add support for exporting TOSCA YAML to Python and Python to YAML; add --overwrite and --python-target options.

  • cloudmap Also allow host URLs for options that had only accepted a pre-configured name of a repository host.

Dry Run improvements

  • When deploying jobs with --dryrun, if the configurator doesn’t support dry run mode and a Mock operation is defined for the node type or template, that operation will be invoked instead.

  • The terraform configurator now supports dryrun_mode and dryrun_outputs input parameters.

Runtime Eval Expressions

  • Eval expressions can now be used in node_filters to query for node matches using the new match keyword in the node filter.

  • Add a “.hosted_on” key to eval expressions that (recursively) follows “.targets” filtered by the HostedOn relationship, even across topology boundaries.

  • Add optional wantList parameter to the jinja2 eval filter to guarantee the result is a list e.g. {{"expression" | eval(wantList=True)}}

  • The trace keyword in eval expressions now accepts break as a value. This will invoke Python’s breakpoint() function when the expression is evaluated.
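
For example, a hedged sketch combining the “.hosted_on” key and the trace keyword above (the property name is illustrative):

properties:
  host_instance_name:
    eval: .hosted_on::.name
    trace: break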

New Environment Variables

  • UNFURL_SEARCH_ROOT environment variable: When searching for ensembles and unfurl projects, Unfurl will stop at the directory this is set to.

  • UNFURL_SKIP_SAVE environment variable: If set, skips saving the ensemble.yaml to disk after a job runs.

Minor Enhancements and Notable Bug Fixes

  • artifacts: Add an asdf artifact for kompose and have the kompose configurator schedule the artifact if kompose is missing.

  • parser: treat import failures as fatal errors (abort parsing and validation).

  • parser: better syntax validation of +include statements.

  • cloudmap: allow anonymous connections to the GitLab and Unfurl Cloud APIs for read-only access.

  • plan: fix spurious validation failures when creating initial environment variable rules

  • tosca: Add support for ‘unsupported’, ‘deprecated’, and ‘removed’ property statuses.

  • tosca: Add support for bitrate scalar units.

  • tosca: fix built-in storage type definition

  • plan: fix candidate status check when deleting instances.

  • server: make sure the generated local unfurl.yaml exists when patching deployments.

  • packages: warn when remote lookup of version tags fails.

  • repo: fix matching local paths with repository paths.

  • loader: fix relative path lookups when processing +include directives during a TOSCA import of a file in a repository.

  • logging: improve readability when pretty printing dictionaries.

0.8.0 - 2023-07-26

Compare with 0.7.1

Breaking Changes

  • The Unfurl package now only depends on the ansible-core package instead of the full ansible package. Unfurl projects that depend on Ansible modules installed by the full package will not work with new Unfurl installations unless that package is installed by some other means or, better, unless the template that requires it declares an Ansible collection artifact as a dependency. For an example, see this usage in the docker-template.yaml.

Features

  • Allow Ansible collections to be declared as TOSCA artifacts. Some predefined ones are defined here.

  • Unfurl Server now tracks the dependent repositories accessed when generating cached representations (e.g. for /export) and uses that information to invalidate cache items when the dependencies change.

  • Unfurl Server now caches more git operations, controlled by these environment variables: CACHE_DEFAULT_PULL_TIMEOUT (default: 120s) and CACHE_DEFAULT_REMOTE_TAGS_TIMEOUT (default: 300s)

  • Unfurl Server: add a /types endpoint that can extract types from a cloudmap.

  • API: allow simpler Configurator.run() implementations

  • cloudmap: The cloudmap command now supports --import local.

  • eval: Unfurl’s Jinja2 filters that are marked as safe can now be used in safe evaluation mode (currently: eval, map_value, sensitive, and the to_*_label family).

Bug Fixes

  • jobs: Fix issue where some tasks that failed during the render phase were missing from the job summary.

  • jobs: Don’t apply the --dryrun option to the tasks that install local artifacts, since those artifacts will most likely be needed to execute the job’s tasks even in dryrun mode.

  • k8s: Fix evaluation of the kubernetes_current_namespace() expression function outside of a job context

  • k8s: Filter out data keys with null values when generating a Kubernetes Secret resource.

  • helm: Fix check operation for unfurl.nodes.HelmRepository

  • helm: If the Kubernetes environment has the insecure flag set, pass --kube-insecure-skip-tls-verify to helm.

Misc