Changelog for Unfurl
v1.1.0 - 2024-4-02
Major changes
Ansible configurator
A playbook’s host is now set automatically if not explicitly specified. See https://docs.unfurl.run/configurators.html#playbook-processing for the selection rules.
If the playbook and inventory input parameters have a string value, detect whether to treat it as a file path or parse it as YAML.
Fix rendering of inventory.yml when the inventory input parameter is set to inline YAML.
If the ansible_connection host var is not explicitly set, default to "ssh" if we're also setting ansible_host to the node's ip_address.
Update tosca package to v0.0.8
Release includes the following fixes and enhancements:
yaml to python: more idiomatic Python when importing __init__.yaml
yaml to python: use import namespace when following imports
yaml to python: don’t forward reference built-in types
overwrite policy: don’t overwrite if converted contents didn’t change
remove dependency on unfurl package.
support array and key access in field projections
allow regular data as arguments to boolean expressions.
add fallback(left: Optional[T], right: T) -> T to unfurl.tosca_plugins.expr for type-safe default expressions (see the sketch after this list).
move the tfoutput and tfvar Options to unfurl.tosca_plugins.expr (this makes them available in safe mode).
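For illustration, here is a minimal Python DSL sketch of the relocated helpers; the class and field names are invented, and only fallback, tfvar, and tfoutput are taken from the entries above:

from typing import Optional
import tosca
from tosca import Attribute, Property
from unfurl.tosca_plugins.expr import fallback, tfvar, tfoutput

class Example(tosca.nodes.Root):
    # map this property/attribute to a Terraform variable/output via the Options
    terraform_var: str = Property(options=tfvar)
    terraform_out: str = Attribute(options=tfoutput)
    maybe_name: Optional[str] = None

    def display_name(self) -> str:
        # fallback(left, right) returns right when left is None,
        # so the result is typed str rather than Optional[str]
        return fallback(self.maybe_name, "default-name")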
Server enhancements and fixes
Add a /empty_cache POST endpoint for clearing the entire cache (with an optional prefix parameter). Access requires the UNFURL_SERVER_ADMIN_PROJECT environment variable to be set and the auth_project URL parameter to match it (a hedged example of calling it follows this list).
Patch and export now support the "branch" field on DeploymentTemplate.
Invalidate the cached blueprint export even if the file in the key didn't change.
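As a rough illustration, the endpoint could be called like this from Python; the server URL, the use of the requests library, and the behavior of prefix are assumptions, only the endpoint path, the prefix and auth_project parameters, and the UNFURL_SERVER_ADMIN_PROJECT variable come from the entry above:

import os
import requests

server = "http://localhost:8080"  # wherever unfurl serve is listening (assumed)
response = requests.post(
    f"{server}/empty_cache",
    params={
        # must match UNFURL_SERVER_ADMIN_PROJECT as configured on the server
        "auth_project": os.environ["UNFURL_SERVER_ADMIN_PROJECT"],
        "prefix": "blueprints/",  # optional; presumably limits clearing to matching keys
    },
)
response.raise_for_status()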
Other notable bug fixes
897c32ba testing: add mypy testing api and make lifecycle api more flexible
3f80e5bf job: fix for TaskView.find_connection()
269f981c runtime: refine “is resource computed” heuristic
253f55aa cloudmap: fix tag refs on gitlab hosts
61da0c6f clone: apply revision if found in fragment of the cloned url
63e2f48a kompose: have KomposeInputs use unfurl_datatypes_DockerContainer
03feb7d7 plan: exclude replaced node templates from relationships
0922b557 logging: smarter truncation of log messages with stack traces
aaabb8a2 packages: fix resolving package compatibility
667c59d9 export: stop clearing out requirement match pointing to nested templates
5bfef1d8 export: fix get_nodes_of_type in match filters
a3c8fb54 export: stop hoisting default templates as requirements.
df2d3e97 parser: fix matching when a requirement target is substituted by the outer topology template.
a3f7feae parser: fix typename checking when evaluating ‘node’ field on requirements
b570246f parser: use the type’s namespace when creating properties.
1b965e5b parser: fix Namespace.get_global_name_and_prefix()
Breaking changes
Drop support for Python 3.7
v1.0.0 - 2024-2-26
Features
We’ve strived to maintain backwards compatibility and API stability for a while now, so for this release we decided to go ahead and christen it 1.0 🎉.
Major new features include:
TOSCA namespaces and global type identifiers
This release adds features designed to enable 3rd party type libraries to be shared within the same TOSCA topology and for Unfurl API consumers (such as Unfurl Cloud) to manage them.
Namespace isolation.
Each imported service template is placed in a separate namespace that is used to resolve type references in the file. It includes the types defined in that file along with the types it imports, with those type names prefixed if the namespace_prefix key is set on the import definition. The namespace will be unique to that import unless an explicit namespace is declared (see below). This can be disabled or limited using the new UNFURL_GLOBAL_NAMESPACE_PACKAGES environment variable (see Breaking changes below). Note that the Python DSL already behaves this way, as documented here.
Support for TOSCA 1.3's namespace field
If a service template explicitly declares a namespace using the namespace keyword, its namespace will be assigned that name and namespace isolation will be disabled for any templates it imports, so any import will share the same namespace unless it also declares its own namespace. In addition, any other template that declares the same namespace identifier will be placed in the same namespace. Because shared namespaces let a template reference types it didn't explicitly import and overwrite existing type definitions with the same name, declaring namespaces is not recommended.
Globally unique identifiers for types.
Namespaces are used to generate globally unique type names, and the Unfurl Server APIs and GraphQL/JSON export format have been updated to use these globally unique names.
They follow the format <typename>@<namespace_id>, where namespace_id is the namespace the type was declared in. If a namespace id isn't explicitly declared using the namespace keyword, one is generated from the package id of the type's repository or current project, and optionally a file path if it isn't the root service template.
For example: ContainerComputeHost@unfurl.cloud/onecommons/std (a type defined in service-template.yaml) and EC2Instance@unfurl.cloud/onecommons/std:aws (a type defined in aws.yaml). TOSCA and unfurl types defined in the core vocabulary (which don't need to be imported) are not qualified. Built-in unfurl types that do need to be imported use unfurl as their package id, for example: unfurl.nodes.Installer.Terraform@unfurl:tosca_plugins/artifacts.
Generation of global type names by export and the APIs can be disabled by setting the UNFURL_EXPORT_LOCALNAMES environment variable (see Breaking changes below).
Cross-package interoperability with type metadata.
A package can declare compatibility with types in different packages without having to import those packages, using the aliases and deprecates keywords in the metadata section of a type definition. The keywords' value can be either a fully qualified type name or a list of fully qualified type names and indicates that the type is equivalent to the listed types. This is used both by the parser and by the API (export includes those types in the exported type's extends section) used by Unfurl Cloud's UI.
Unfurl Server and export APIs
Unfurl Server's (and the Unfurl Cloud front-end's) patching API now uses global type names to generate import statements with prefixes as needed to prevent clashes between packages with the same name.
Support for HTTP caching: improved etag generation and Cache-Control headers enable browser and proxy caches to use stale content while processing slow requests. Use the CACHE_CONTROL_SERVE_STALE environment variable to set this.
Add a Dockerfile that includes an nginx caching proxy in front of unfurl server. Provide prebuilt container images as onecommons/unfurl:v1.0.0-server-cached on docker.io and ghcr.io.
Improvements to Redis caching: track dependencies better, cache commit dates for more efficient shallow clones, improved error handling and recovery.
Local developer mode will now serve content from any local repository tracked by the current project or the home project (in local/unfurl.yaml). It also improves handling of local changes and error recovery.
Improve type annotations for new GraphQL types and the consolidated GraphQL schema.
Add support for a property_metadata metadata key to apply metadata to individual properties on a TOSCA datatype.
For example, this property declaration applies the user_settable metadata key to the environment property on unfurl.datatypes.DockerContainer:
container:
  type: unfurl.datatypes.DockerContainer
  metadata:
    property_metadata:
      environment:
        user_settable: true
Python DSL
Service templates written in Python now have the following integration points with Unfurl’s runtime:
ToscaType implementations of TOSCA operations can return a Python method instead of an artifact or configurator and that method will be converted to a configurator (see the sketch after this list).
If a TOSCA operation attempts to execute runtime-only functionality, it will be invoked during the job's render phase instead of being converted to YAML.
Introduce a Computed() field specifier for TOSCA properties whose value is computed at runtime with the given method.
Add a unfurl.tosca_plugins.functions module containing utility functions that can be executed in the safe-mode Python sandbox. This allows these functions to be executed in "spec" mode (e.g. as part of a class definition or in _class_init).
Add a unfurl.expr module containing type-safe Python equivalents to Unfurl's eval expression functions.
Add methods and free functions providing type-safe equivalents to Unfurl's query expressions: find_configured_by, find_hosted_on, find_required_by and find_all_required_by.
Providing these APIs required synchronizing a job's task context as per-thread global runtime state, so public APIs were added for querying global state and retrieving the current RefContext.
Treat non-required instance properties as optional (return None instead of raising KeyError).
Add a public unfurl.testing.create_runner() API for writing unit tests for Python DSL service templates.
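Here is a minimal, hedged sketch of the first integration point, an operation that returns a plain Python method; the class, property, and method names are invented, only the @operation decorator and the "return a method" behavior come from the notes above:

import tosca
from tosca import operation

class MyService(tosca.nodes.Root):
    url: str = "https://example.com"

    def do_configure(self, **kw):
        # runtime-only logic; because a TOSCA operation returns this method,
        # Unfurl wraps it in a configurator and invokes it during the job
        print("configuring", self.url)

    @operation(apply_to=["Standard.configure"])
    def default(self, **kw):
        return self.do_configure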
Various improvements to YAML-to-Python and Python-to-YAML conversion, including:
Better support for artifacts
Better default operation conversion
Support for aliased references (multiple variables assigned to the same type).
Special-case module attributes named __root__ to generate TOSCA substitution_mappings (aka root). For example:
__root__ = my_template
or
__root__ = MyNodeType
DSL API improvements:
Add a NodeTemplateDirective string enum for type-safe node template directives.
Add an @anymethod method decorator for creating methods that can act as both a classmethod and a regular method.
Revamp the Options API for typed and validated metadata on field specifiers.
Add a DEFAULT sentinel value to indicate that a field should have a default value constructed from its type annotation. This helps when using forward references to types that aren't defined yet and avoids a bit of repetition (see the sketch after this list).
Add a similar CONSTRAINED sentinel value to indicate that the property's value will be set in _class_init.
Add a unfurl.support.register_custom_constraint() API for registering custom TOSCA property constraint types.
Add a UNFURL_TEST_SAFE_LOADER environment variable to force a runtime exception if the tosca loader isn't in safe mode or to disable safe mode (for testing). UNFURL_TEST_SAFE_LOADER=never disables safe mode; any other non-empty value enforces it.
Improve the API documentation.
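A hedged sketch of the DEFAULT sentinel; whether it is importable directly from tosca and how it constructs a default for a forward-referenced type are assumptions based on the description above, and the Database/User classes are invented:

import tosca
from tosca import DEFAULT

class Database(tosca.nodes.Root):
    # DEFAULT asks the DSL to construct the default value from the type
    # annotation, even though User isn't defined until later in the module
    owner: "User" = DEFAULT

class User(tosca.nodes.Root):
    name: str = "admin"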
Release tosca package 0.0.7
Packages
Package version resolution during loading of an ensemble now honors the lock section of the ensemble if present. After an ensemble is deployed, the version tag recorded in the lock section for a package will take precedence.
The lock section will record if a repository was missing version tags; in that case, subsequent deployments will attempt to fetch the first version tag (as opposed to the default behavior of fetching the latest version tag if one wasn't explicitly specified).
Package rules are now serialized in the lock section, following the format used for the UNFURL_PACKAGE_RULES environment variable. These rules will be applied in future unfurl invocations if the UNFURL_PACKAGE_RULES environment variable is missing.
Package rules are applied more widely, for example when matching repository URLs and to namespace ids.
A "packages" key has been added to the Deployment GraphQL object containing the packages and versions used by the deployment.
Artifact enhancements
Support for the following built-in keywords on artifact definitions has been added as TOSCA extensions:
contents: Artifacts can now specify their contents inline using this new field. Contents can be auto-generated using eval expressions or jinja templates.
target: The node template to deploy the artifact to (the default is the node template the artifact is defined in).
permissions: A string field indicating the file permissions to apply when an artifact is deployed as a file.
order: A number field that provides a weight for establishing the order in which declared artifacts are processed.
intent: A string field for declaring the (user-defined) intent of the artifact.
Built-in artifact fields will now be evaluated like user-defined properties at runtime (e.g. they can contain eval expressions).
Artifacts can now be dynamically created when declared in resultTemplates.
Other Features
Support Python 3.12
Add an unfurl.testing module for unit tests.
Update the dashboard skeleton to use the new onecommons/std repository.
The runtime will add a default relationship of type unfurl.relationships.ConnectsTo.ComputeMachines to the current environment when a primary provider isn't defined in that environment.
Add support for a validation key in type metadata to provide user-defined eval expressions for validating properties and custom datatypes (equivalent to the validation key in property metadata).
Add required and notnull options to the "strict" field on "eval" expressions. This provides simple validation of eval expressions, e.g.:
eval: .::foo
strict: required # raise error if no results are found
or
eval: .::foo
strict: notnull # raise error if result isn't found or is null
Breaking changes
Prior to this release, all imports shared the same namespace for types. For example, one imported file could reference a type declared in another file if both files happened to be imported by the same service template.
With this release, that behavior is only enabled if the importing template declares a "namespace" key and the imported templates don't declare a different namespace. Otherwise, each TOSCA import now has its own namespace and types are only visible to the templates that import them.
If a package you can't modify depends on the old behavior, you can use the UNFURL_GLOBAL_NAMESPACE_PACKAGES environment variable, which expects a space-separated list of namespace names (wildcards allowed). For example, UNFURL_GLOBAL_NAMESPACE_PACKAGES='*' would enable that behavior for all service templates.
If missing, it defaults to "unfurl.cloud/onecommons/unfurl-types*".
The unfurl server API and the export cli command now export type names as fully qualified. This can be disabled by setting the UNFURL_EXPORT_LOCALNAMES environment variable; this environment variable will be removed in a future release.
To enable Python 3.12, Unfurl now depends on ansible-core version 2.15.9.
Minor Enhancements and Notable Bug Fixes
job If a node template is created for a “discovered” instance that has a parent, set that parent as the “host” requirement for the template.
job Improve the job summary table and other logging tweaks.
job Report blocked tasks separately from failed jobs (and mark more tasks as blocked).
plan Improve heuristic for detecting circular dependencies and deadlocks (fixes #281)
job don’t silently skip operations when instantiating a configurator class fails.
runtime Fix to assure the reevaluation of computed results when the computation’s dependencies change.
cli Add verbose output to the status command if the -v flag is used.
cloudmap add a --repository cli option
cloudmap add a save_internal config setting for Gitlab hosts.
job Add a heuristic for setting the status on relationship instances when the task's configurator doesn't explicitly set it.
job add the unfurl install path to the $PATH passed to tasks.
job By default, dry run jobs print the modified ensemble.yaml to the console instead of saving it to disk. Use UNFURL_SKIP_SAVE=never to force them to be saved.
parser Remove an attribute definition if a derived type re-declares it as a property.
parser: Move default node templates (those with a "default" directive) completely from the inner topology to the root topology.
loader, export: don't add environment instances or imports to the ensemble except when generating json for the environments. By default, don't instantiate environment instances unless they are imported from external ensembles. Similarly, don't include imports declared in an environment unless they have a namespace_prefix that is referenced by a connection in the environment.
logging add a UNFURL_LOG_TRUNCATE environment variable to change the maximum log message length (default: 748)
parser Accept relationship types defined in TOSCA extensions in requirement definitions.
eval have the template function's path parameter make relative paths relative to the source location
dns install the octodns artifact if needed
ansible load ansible collection artifacts as soon as they are installed.
kompose make sure service names conform to dns host name syntax
kompose fix the expose input parameter
kompose implement the env input parameter
kompose update the artifact version
package-wide many improvements to type annotations; for example, introducing TypedDicts and overloaded signatures for the to_label family of functions.
v0.9.1 - 2023-10-25
Features
TOSCA and Python DSL
Add ToscaInputs classes for Unfurl's built-in configurators to enable static type checking of configurator inputs.
Introduce the Options class to enable typed metadata for TOSCA fields and have the Terraform configurator add tfvar and tfoutput options to automatically map properties and attributes to Terraform variables and outputs, respectively.
This example uses the above features to integrate a Terraform module.
import tosca
from tosca import Attribute, Property, operation
from unfurl.configurators.terraform import TerraformConfigurator, TerraformInputs, tfvar, tfoutput

class GenericTerraformManagedResource(tosca.nodes.Root):
    example_terraform_var: str = Property(options=tfvar)
    example_terraform_output: str = Attribute(options=tfoutput)

    @operation(apply_to=["Install.check", "Standard.configure", "Standard.delete"])
    def default(self, **kw):
        return TerraformConfigurator(TerraformInputs(main="terraform_dir"))
Introduce unfurl.datatypes.EnvironmentVariables, a TOSCA datatype that converts to a map of environment variables. Subclass this type to enable statically typed environment variables (a sketch follows this list).
Allow TOSCA data types to declare a "transform" in metadata that is applied as a property transform.
node_filter improvements:
Recursively merge requirements keys in node filters when determining the node_filter for a requirement.
Allow the get_nodes_of_type TOSCA function in node_filter match expressions.
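A hedged sketch of a statically typed environment-variable map; the subclass name and fields are invented, and the Python import path for the datatype is an assumption (the changelog names the TOSCA type unfurl.datatypes.EnvironmentVariables):

import unfurl.datatypes

class AppEnv(unfurl.datatypes.EnvironmentVariables):
    # each typed field becomes an environment variable in the generated map
    DATABASE_URL: str
    LOG_LEVEL: str = "info"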
Release 0.0.5 version of the Python tosca package.
Packages
Allow service templates to declare the unfurl version they are compatible with.
They can do this by declaring a repository for the unfurl package like so:
repositories:
  unfurl:
    url: https://github.com/onecommons/unfurl
    revision: v0.9.1
Unfurl will still resolve imports in the unfurl package using the locally installed version of unfurl, but it will raise an error if that version isn't compatible with the version declared here.
Do a semver compatibility check for 0.* versions.
Even though pre-1.0 versions aren't expected to provide semver guarantees, the alternative to doing the semver check is to treat every version as incompatible with every other, which would require every version reference to a package to be updated with each package update. This isn't very useful, especially when developing against an unstable package.
Support file URLs in package rules.
Minor Enhancements and Notable Bug Fixes
parser allow merge keys to be optional, e.g. “+?/a/d”
loader: Add a UNFURL_OVERWRITE_POLICY environment variable to guide the loader's python-to-yaml converter.
loader: Relax restrictions on from foo import * and other bug fixes with the DSL sandbox's Python import loader.
init: apply UNFURL_SEARCH_ROOT to the unfurl project search.
packages: If a package rule specifies a full url, preserve it when applying the rule.
v0.9.0 - Friday 10-13-23
Features
Introduce Python DSL for TOSCA
Write TOSCA as Python modules instead of YAML. Features include:
Static type checking of your TOSCA model.
IDE integration.
The export command now supports conversion from YAML to Python and Python to YAML.
The Python data model simplifies TOSCA YAML but also allows advanced constraints that encapsulate verbose relationships and node filters.
Python executes in a sandbox to safely parse untrusted TOSCA service templates.
See https://github.com/onecommons/unfurl/blob/main/tosca-package/README.md for more information.
Unfurl Cloud local development mode
You can now see local changes to a blueprint project under development on Unfurl Cloud by running unfurl serve .
in the project directory. If the project was cloned from Unfurl Cloud, it will connect to that local server to render and deploy that local copy of the blueprint (for security, on your local browser only). Use the --cloud-server
option to specify an alternative instance of Unfurl Cloud.
Embedded Blueprints (TOSCA substitution mapping)
Support for TOSCA substitution mapping has been stabilized and integrated into Unfurl Cloud.
Support includes an extension to TOSCA's requirements mapping that lets you essentially parameterize an embedded template by letting the outer (embedding) template substitute node templates in the embedded template.
When a substituted node template (in the outer topology) declares a requirement whose name matches the name of a node template in the substituted (inner) topology, that node template will be replaced by the node template targeted by the requirement.
For example, if the substituted (inner) topology looked like:
node_types:
  NestedWithPlaceHolder:
    derived_from: tosca:Root
    requirements:
      - placeholder:
          node: PlaceHolderType

topology_template:
  substitution_mapping:
    node_type: NestedWithPlaceHolder
  node_templates:
    placeholder:
      type: PlaceHolderType
      ...
Then another topology that is embedding it can replace the “placeholder” template like so:
node_templates:
  nested1:
    type: NestedWithPlaceHolder
    directives:
      - substitute
    requirements:
      - placeholder: replacement

  replacement:
    type: PlaceHolderType
    ...
CLI Improvements
Add a --skip-upstream-check global option which skips pulling the latest upstream changes from existing repositories and checking remote repositories for version tags.
Improvements to sub commands:
init Add Azure and Kubernetes project skeletons (unfurl init --skeleton k8s and unfurl init --skeleton azure).
clone Add a --design flag to configure the cloned project for blueprint development.
serve Add a --cloud-server option to specify the Unfurl Cloud instance to connect to.
export Add support for exporting TOSCA YAML to Python and Python to YAML; add --overwrite and --python-target options.
cloudmap Also allow host urls for options that had only accepted a pre-configured name of a repository host.
Dry Run improvements
When deploying jobs with --dryrun, if a Mock operation is defined for a node type or template it will be invoked when the configurator doesn't support dry run mode.
The terraform configurator now supports dryrun_mode and dryrun_outputs input parameters.
Runtime Eval Expressions
Eval expressions can now be used in node_filters to query for node matches using the new match keyword in the node filter.
Add a ".hosted_on" key to eval expressions that (recursively) follows ".targets" filtered by the HostedOn relationship, even across topology boundaries.
Add an optional wantList parameter to the jinja2 eval filter to guarantee the result is a list, e.g. {{ "expression" | eval(wantList=True) }}.
The trace keyword in eval expressions now accepts break as a value. This will invoke Python's breakpoint() function when the expression is evaluated.
New Environment Variables
UNFURL_SEARCH_ROOT environment variable: when searching for ensembles and unfurl projects, Unfurl will stop when it reaches the directory this is set to.
UNFURL_SKIP_SAVE environment variable: if set, skips saving the ensemble.yaml to disk after a job runs.
Minor Enhancements and Notable Bug Fixes
artifacts: Add an asdf artifact for kompose and have the kompose configurator schedule the artifact if kompose is missing.
parser: treat import failures as fatal errors (abort parsing and validation).
parser: better syntax validation of +include statements.
cloudmap: allow anonymous connections to the Gitlab and Unfurl Cloud apis for readonly access
plan: fix spurious validation failures when creating initial environment variable rules
tosca: Add support for 'unsupported', 'deprecated', and 'removed' property statuses.
tosca: Add support for bitrate scalar units.
tosca: fix built-in storage type definition
plan: fix candidate status check when deleting instances.
server: make sure the generated local unfurl.yaml exists when patching deployments.
packages: warn when remote lookup of version tags fails.
repo: fix matching local paths with repository paths.
loader: fix relative path lookups when processing +include directives during a TOSCA import of a file in a repository.
logging: improve readability when pretty printing dictionaries.
0.8.0 - 2023-07-26
Breaking Changes
The Unfurl package now only depends on the ansible-core package instead of the full ansible package. Unfurl projects that depend on ansible modules installed by the full package will not work with new Unfurl installations unless it is installed by some other means or, better, the template that requires it declares an Ansible collection artifact as a dependency. For an example, see this usage in the docker-template.yaml.
Features
Allow Ansible collections to be declared as TOSCA artifacts. Some predefined ones are defined here.
Unfurl Server now tracks the dependent repositories accessed when generating cached representations (e.g. for /export) and uses that information to invalidate cache items when the dependencies change.
Unfurl Server now caches more git operations, controlled by these environment variables: CACHE_DEFAULT_PULL_TIMEOUT (default: 120s) and CACHE_DEFAULT_REMOTE_TAGS_TIMEOUT (default: 300s).
Unfurl Server: add a /types endpoint that can extract types from a cloudmap.
API: allow simpler Configurator.run() implementations.
cloudmap: The cloudmap command now supports --import local.
eval: Unfurl's Jinja2 filters that are marked as safe can now be used in safe evaluation mode (currently: eval, map_value, sensitive, and the to_*_label family).
Bug Fixes
jobs: Fix issue where some tasks that failed during the render phase were missing from the job summary.
jobs: Don't apply the --dryrun option to tasks that are installing local artifacts (since they will most likely be needed to execute the tasks in the job even in dryrun mode).
k8s: Fix evaluation of the kubernetes_current_namespace() expression function outside of a job context.
k8s: Filter out data keys with null values when generating a Kubernetes Secret resource.
helm: Fix the check operation for unfurl.nodes.HelmRepository.
helm: If the Kubernetes environment has the insecure flag set, pass --kube-insecure-skip-tls-verify to helm.
Misc
Introduce CHANGELOG.md (this file – long overdue!)
CI: container images will be built and pushed to https://github.com/onecommons/unfurl/pkgs/container/unfurl with every git push to CI, regardless of the branch. (In addition to the container images at https://hub.docker.com/r/onecommons/unfurl, which are only built from main.)