
[Feature] Add Support for HCL Output #225

Closed
straubt1 opened this issue Jul 21, 2020 · 31 comments · Fixed by #3365
Labels
enhancement (New feature or request) · needs-research · size/large (estimated < 1 month) · ux/integration

Comments

@straubt1

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or other comments that do not add relevant new information or questions; they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

Description

Currently the output is JSON, but it would be great if we could output HCL.
While we can run cd cdktf.out && terraform plan, I could see real value and interesting solutions created if the output were HCL.

@straubt1 added the enhancement (New feature or request) label Jul 21, 2020
@skorfmann
Contributor

That's interesting, do you have a specific use-case in mind?

@JayDoubleu

Programmatically generating TF files and modules for future edits by users is one of the potential use cases.

@skorfmann
Contributor

Programmatically generating TF files and modules for future edits by users is one of the potential use cases.

I was thinking about generating modules as a use case as well. Sort of like having a Construct package which would work with cdktf and have a compatible module output. For this, JSON should be good, though.

When thinking about HCL for actual users to continue to work with, I'm wondering if that would make sense for this project. It's very focused on automatically generating JSON configurations, which are not optimised for human consumption. It could still, in theory, be achieved by using something like this with some sprinkles of jq to clean up the JSON a bit.

@straubt1
Author

I had similar thoughts to @JayDoubleu here.

json2hcl could be a potential workaround, but I really like the idea of native support.

@anubhavmishra
Member

Hey @straubt1, this is interesting! For now, users can use json2hcl to convert Terraform JSON configuration to HCL. You might run something like cdktf synth --json or cat cdktf.out/cdk.tf.json | json2hcl.

@JayDoubleu

JayDoubleu commented Jul 21, 2020

I'll even go a step further:

  • Ability to spit out pure HCL
  • Ability to fully programmatically run terraform plan, apply and other commands directly from python/typescript (not just executing the terraform binary)

@skorfmann
Contributor

  • Ability to fully programmatically run terraform plan, apply and other commands directly from python/typescript (not just executing the terraform binary)

There's the idea of supporting a workflow which is fully backed by Terraform Cloud, so that you wouldn't need the Terraform CLI locally at all. Technically, that's still a similar concept to what we have right now, though.

@JayDoubleu

JayDoubleu commented Jul 22, 2020

If I could run the local terraform CLI via some sort of API, then IMHO it would be useful in CI systems like Azure DevOps etc. Ideally I could print to the screen using the terraform CLI's default interface but run the commands programmatically behind the scenes...

I like the terraform CLI and HCL from a user perspective; it's just a bit of a pain to orchestrate, so I started looking into things like Pulumi.

If I could programmatically:

  • create actual Terraform HCL
  • have an API to interact with the terraform CLI (not cloud)

This would open a wide range of possibilities, like templating HCL using TypeScript/Python, but with the intention that the end users work in HCL.

An API to interact with the terraform CLI is, IMO, a much-needed feature for pipelines and for orchestrating entire environments with Terraform and external data coming from other programming languages.

@skorfmann
Contributor

I like the terraform CLI and HCL from a user perspective; it's just a bit of a pain to orchestrate

If you're just after Terraform orchestration, have you looked at Terragrunt? For cdktf, there will likely be support for multiple stacks as well (see #35).

If I could programmatically:

  • create actual Terraform HCL
  • have an API to interact with the terraform CLI (not cloud)

This would open a wide range of possibilities, like templating HCL using TypeScript/Python, but with the intention that the end users work in HCL.

This sounds like a one-off code generator, since I wouldn't know how you'd reconcile changes made by the user with the generated configuration. Are you after a generator similar to create-react-app, but for Terraform HCL projects?

@JayDoubleu

JayDoubleu commented Jul 22, 2020

The ability to produce HCL has many use cases; human-readable template generation is just one of them, and yes, it would look similar to the react-app generator you mentioned, but I'm hoping cdktf is already just the right tool for that. What would make it even better is the ability to spit out HCL. ;)

Regarding the API itself and Terragrunt: I'm aware of Terragrunt; however, it's just a wrapper around terraform which can be implemented in any language.

What I'm after is something similar to the below:

app = App()
MyStack(app, "automate-everything")
synth = app.synth()
plan = app.plan(synth)

if plan.changes is not None:
    if len(plan.changes.module.custom.fruits) > 4:
        app.apply(synth, target='module.custom.fruits')

@skorfmann
Contributor

Regarding the API itself and Terragrunt: I'm aware of Terragrunt; however, it's just a wrapper around terraform which can be implemented in any language.

What I'm after is something similar to the below:

app = App()
MyStack(app, "automate-everything")
synth = app.synth()
plan = app.plan(synth)

if plan.changes is not None:
    if len(plan.changes.module.custom.fruits) > 4:
        app.apply(synth, target='module.custom.fruits')

That's pretty interesting! Could you create a separate issue for this?

The ability to produce HCL has many use cases; human-readable template generation is just one of them, and yes, it would look similar to the react-app generator you mentioned, but I'm hoping cdktf is already just the right tool for that. What would make it even better is the ability to spit out HCL. ;)

Cool, thanks - I think I understand the use case you're describing. The one-off generator use case could also be a dedicated issue that references this issue.

@marcoferrer
Contributor

@skorfmann Just wanted to chime in on this discussion. Supporting HCL as a target output format brings the benefit of allowing sources generated by this project to be better supported by existing tools from the Terraform ecosystem.

For example, these are a few of the limitations we've experienced when trying to adopt this tool.

  • IntelliJ HCL plugin does not support resolving definitions from tf.json sources. This normally wouldn't be a problem if the entire module was defined in json. In our case, we actually plan to distribute common cdk modules to engineers to help them conform to an archetype / interface. They then implement their application-specific tf definitions in HCL within the same directory.
  • The SCA tool checkov also seems to lack support for scanning tf.json sources

Currently, we have a convoluted workaround where we strip out the // metadata keys from the cdk output, run it through json2hcl and then finally run terraform 0.12upgrade, since the syntax generated by json2hcl is dated and produces errors in our IDE. Although it works, we don't feel like it's a sustainable long-term option. Native support for HCL would be preferable.
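
For reference, the workaround roughly looks like this (a minimal sketch, assuming json2hcl is on the PATH and the synthesized config sits at the default cdktf.out/cdk.tf.json path; the strip_metadata helper and the main.tf output name are just for illustration):

import json
import subprocess

def strip_metadata(node):
    # Recursively drop the "//" metadata keys that cdktf adds to its JSON output
    if isinstance(node, dict):
        return {k: strip_metadata(v) for k, v in node.items() if k != "//"}
    if isinstance(node, list):
        return [strip_metadata(v) for v in node]
    return node

with open("cdktf.out/cdk.tf.json") as src:
    config = strip_metadata(json.load(src))

# Pipe the cleaned JSON through json2hcl; terraform 0.12upgrade still has to be
# run on the result afterwards, as described above
result = subprocess.run(
    ["json2hcl"],
    input=json.dumps(config),
    capture_output=True,
    text=True,
    check=True,
)

with open("main.tf", "w") as dst:
    dst.write(result.stdout)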

@skorfmann
Contributor

@marcoferrer thanks for your input! Integration into the broader ecosystem of tooling for Terraform HCL is a very good point indeed. So far I was assuming that the JSON representation of HCL would just work in most cases; I haven't looked at it in detail, though.

IntelliJ HCL plugin does not support resolving definitions from tf.json sources. This normally wouldn't be a problem if the entire module was defined in json.

What you're saying is that either pure JSON or pure HCL is supported by the extension, but a mix of both is not?

In our case, we actually plan to distribute common cdk modules to engineers to help them conform to an archetype / interface. They then implement their application-specific tf definitions in HCL within the same directory.

I'm very interested in more details about your intended workflow. By cdk module you mean an actual Terraform module containing the JSON output of cdktf?

The SCA tool checkov also seems to lack support for scanning tf.json sources

Haven't tested it, but judging from the code it should be supported.

Currently, we have a convoluted workaround where we strip out the // metadata keys from the cdk output, run it through json2hcl

Does json2hcl not deal with // nicely, or what's the reason to strip it out?

and then finally run terraform 0.12upgrade, since the syntax generated by json2hcl is dated and produces errors in our IDE. Although it works, we don't feel like it's a sustainable long-term option.

That's essentially your release process for the packages / modules you're planning to distribute?

@marcoferrer
Contributor

marcoferrer commented Jan 19, 2021

What you're saying is that either pure JSON or pure HCL is supported by the extension, but a mix of both is not?

It's actually a little worse than that. Using tf.json sources is completely unsupported by the IDE plugin. Of course, this has no effect on the validity of the sources or the ability to use them with the terraform cli, but it does negatively impact the module development experience.

I'm very interested in more details about your intended workflow. By cdk module you mean an actual Terraform module containing the JSON output of cdktf?

We've defined standards within our organization for Terraform modules for specific types of applications. These standards define, at a minimum, what providers/aliases are expected, as well as a predefined set of variables and outputs. This forms a pseudo-interface we expect each module to conform to. Ensuring modules conform to this interface allows us to build org-specific automation and policy governance around our infrastructure lifecycle.

Since Terraform isn't exactly friendly to reuse of variables, outputs, and providers, we were planning on defining cdk modules in Java and publishing them as a jar to an internal artifact repository. Then, using Maven/Gradle, engineers are able to pull a semantically versioned module and instantiate it from within their local cdk module sources. This prevents the need to continually copy/paste specific files throughout all the projects.

I can supply an example project to further demonstrate if you'd like.

Haven't tested it, but judging from the code it should be supported.

That's odd; I tested it locally with one of our projects and ran into issues. I'll give it another shot to see if there was something I missed.

Does json2hcl not deal with // nicely, or what's the reason to strip it out?

Including the // when running json2hcl results in invalid HCL definitions. In most cases the // key gets converted into an HCL keyword. Since json2hcl is no longer being actively developed, its output is in HCL1, if I recall correctly. Because of this, the IDE ends up displaying various errors, and once again autocomplete is rendered useless in other, non-generated sources.

It seems that some tools aren't happy with the // keys in the output and complain about them being an invalid definition. If this is truly part of the tf.json spec, then I would assume it's on the individual tool maintainers to fix any issues. If not, there is the option of potentially adding a feature flag to cdktf to omit these keys/metadata from the output.

That's essentially your release process for the packages / modules you're planning to distribute?

In a nutshell, yes. It's purely to maintain the developer experience and allow the IDE to resolve definitions created by cdktf from other files being authored by engineers. The terraform CLI also produces errors when trying to read HCL that includes the converted // keys.

@jsteinich
Collaborator

@marcoferrer out of curiosity, do you plan to have other teams eventually transition to writing their configuration using Java rather than hcl?

@marcoferrer
Contributor

marcoferrer commented Jan 19, 2021

@jsteinich Currently, that seems to be the long-term goal, but we do want to be able to support cases where we might have to supplement a module with manually written HCL. The move to Java is to allow our engineers to leverage their existing experience and dependency-management tools, so that they are able to focus purely on Terraform concepts without too much overhead from learning HCL specifics or reusing code via copy/paste.

@skorfmann
Contributor

I can supply an example project to further demonstrate if you'd like.

Yes, an example would be great!

Including the // when running json2hcl results in invalid HCL definitions. In most cases the // key gets converted into an HCL keyword. Since json2hcl is no longer being actively developed, its output is in HCL1, if I recall correctly. Because of this, the IDE ends up displaying various errors, and once again autocomplete is rendered useless in other, non-generated sources.

That's certainly not ideal. Perhaps it would make sense to fork and fix json2hcl for now, but I don't know the effort required to do so. I'll check if there's an easy option to do this.

It seems that some tools aren't happy with the // keys in the output and complain about them being an invalid definition. If this is truly part of the tf.json spec, then I would assume it's on the individual tool maintainers to fix any issues. If not, there is the option of potentially adding a feature flag to cdktf to omit these keys/metadata from the output.

The purpose of // is just metadata at the moment. I could imagine that we add an option to not render the metadata at all, which would potentially lead to a degraded experience around debugging and error handling once we start to implement those topics (right now there wouldn't be a difference). It should be noted, though, that the // keys are part of the JSON syntax of Terraform. In the long run, fixing the tools to deal with // would certainly be better :)

@marcoferrer
Contributor

Here's a distilled example of using cdk in a Java project with Gradle. It also demonstrates sharing common configuration between multiple modules, as well as mixing HCL and JSON sources: #510

That's certainly not ideal. Perhaps it would make sense to fork and fix json2hcl for now, but I don't know the effort required to do so. I'll check if there's an easy option to do this.

After looking at the implementation of json2hcl, it looks like it relies on the first-party HCL libraries published by HashiCorp. It seems to be a simple, light wrapper. I can try forking the project and bumping the dependencies to see if that solves the issues I'm seeing, since it's still using the HCL lib from 2016.

The purpose of // is just metadata at the moment. I could imagine that we add an option to not render the metadata at all, which would potentially lead to a degraded experience around debugging and error handling once we start to implement those topics (right now there wouldn't be a difference). It should be noted, though, that the // keys are part of the JSON syntax of Terraform. In the long run, fixing the tools to deal with // would certainly be better :)

This is good to know, and I agree. As long as the JSON syntax spec for Terraform is being adhered to, any issues with // should be handled by the tool developers.

@Satak

Satak commented Mar 20, 2021

Programmatically generating TF files and modules for future edits by users is one of the potential use cases.

Yes, this is exactly what we would do if this were supported. For example, ServiceNow Cloud management only supports HCL files, not JSON. Native HCL output would be expected from this tool.

@mark-e-kibbe

For those of us who have implemented automation with the Terraform APIs for Cloud & Enterprise, would this issue be relevant to parsing a Java object out to HCL format?

For example, by taking a Map<String, someJavaModel> and doing an hclParse to obtain the string to utilize with the Terraform APIs?

Example Use Case:
A complex type is required to run some Terraform. This type is a map of objects, enabling the Terraform to iterate over said collection of objects with for_each.

Current Scenario:
Manually having to convert Terraform complex types to an HCL string on a case-by-case basis, versus being able to serialize to an HCL string generically.

I came across this issue while researching better ways to handle this use case.
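
For illustration, here is a rough sketch of the kind of conversion described above, written in Python rather than Java to match the other snippets in this thread; the to_hcl helper and the example fruits map are hypothetical, and it only handles maps, lists, strings, numbers, and booleans:

def to_hcl(value, indent=0):
    # Render a plain data structure as an HCL expression, e.g. for a variable
    # sent to the Terraform Cloud/Enterprise API with its hcl flag set
    pad = "  " * indent
    if isinstance(value, bool):
        return "true" if value else "false"
    if isinstance(value, (int, float)):
        return str(value)
    if isinstance(value, dict):
        body = "\n".join(f"{pad}  {k} = {to_hcl(v, indent + 1)}" for k, v in value.items())
        return "{\n" + body + "\n" + pad + "}"
    if isinstance(value, list):
        return "[" + ", ".join(to_hcl(v, indent) for v in value) + "]"
    return f'"{value}"'

# A map of objects that a root module could consume with for_each
fruits = {
    "apple": {"color": "red", "count": 3},
    "banana": {"color": "yellow", "count": 5},
}
print(to_hcl(fruits))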

@DanielMSchmidt added the needs-priority (Issue has not yet been prioritized; this will prompt team review), size/large (estimated < 1 month) and ux/integration labels Dec 7, 2021
@DanielMSchmidt added the needs-research label and removed the needs-priority (Issue has not yet been prioritized; this will prompt team review) label Feb 17, 2022
@nitrocode

nitrocode commented Oct 29, 2022

There are a number of json2hcl projects, and it would help to specify which one. The one by kvz looks maintained (changes from this year), and an issue about the parsing of the double slashes hasn't been created yet.

https://github.com/kvz/json2hcl

Edit: I haven't tried this tool, but I did see the issue where HCL2 is still unsupported.

@fdervisi

I came across this topic related to converting Terraform CDK code to HashiCorp Configuration Language (HCL) format. I was wondering if there has been any progress in this area, specifically in terms of a built-in method to accomplish this task.

If this feature is not currently available, I would appreciate any information on the best programmatic approach to achieve this conversion. From my understanding, it can be done manually using Terraform's jsonencode and hcl2_to_json commands, but I am open to other methods as well.

@jsteinich
Collaborator

I came across this topic related to converting Terraform CDK code to HashiCorp Configuration Language (HCL) format. I was wondering if there has been any progress in this area, specifically in terms of a built-in method to accomplish this task.

If this feature is not currently available, I would appreciate any information on the best programmatic approach to achieve this conversion. From my understanding, it can be done manually using Terraform's jsonencode and hcl2_to_json commands, but I am open to other methods as well.

I don't believe this area has been worked on and I'm not sure if there are any actively maintained tools for doing so either.
I'm curious what your intended use case is.

@fdervisi

For my project, I have developed a comprehensive Terraform CDK Cloud Deployment framework. However, in comparison to Terraform HCL, I found Terraform CDK to be more complex in terms of setup and operations. To simplify the process, my goal is to convert Terraform CDK into Terraform HCL.

Alternatively, I have considered deploying Terraform CDK into a container and passing a YAML file to it. Unfortunately, I have been unable to find a way to run Terraform CDK in a container. As a result, I am actively seeking a straightforward solution for the end users of my tool.

@srgustafson8

I'm curious what your intended use case is.

Not the OP, but our use case is to help with debugging. My team personally doesn't use CDKTF (we use pure HCL) but we have some dev teams in the organisation who are adopting CDKTF. When they have an issue with infrastructure, it would be faster and easier to determine if it's an infra, architecture or code issue if we could examine the 'HCL' that they are trying to deploy.

@JayDoubleu

This comment was marked as off-topic.

@moali87

moali87 commented Apr 13, 2023

Another use case: We use Snyk to scan our infrastructure code. Snyk can scan both CDK and TF, but not the JSON output of CDKTF, so building the HCL file would be beneficial for us here.

@wstrydom-ebsco

There are a couple more reasons for exporting Terraform-native files:

  1. The audience maintaining and auditing the files is not familiar with general-purpose languages and works in Terraform and CloudFormation on a daily basis. The division of labor is that folks like me write generic Terraform modules and code generators (since not all algorithms can be expressed in Terraform), while operations and infrastructure folks "glue" them together.

  2. Replacing custom code generators with CDKTF. Examples of code generation involve working across accounts, regions and resources; this often involves permutations and logic based on metadata (for example, if it's a devtest network, peer it to the devtest transit gateway in the given region; the devtest transit gateway is in a different account). While generating modules may be beneficial, some of these permutations are applicable at the root module, like creating the providers and regional resources in AWS and then connecting them for a specific purpose, whether it's transit gateways in a shared networking account or configuring a multi-regional system that's active-active.

  3. Migration to CDKTF. Consider a rather complex Terraform codebase with tens of thousands of lines of code, built by folks who didn't know general-purpose languages (so there's no code generation). There's a ton of scaffolding to cater for regions, environments, accounts and whatnot. We'd like to slowly replace those with CDKTF (since early tests show great potential). However, to ease migration, we'd start off by moving only a few resources from HCL to CDKTF. The desired scenario would be that the resources generated by CDKTF can replace existing Terraform files. For example, we may have an S3 bucket with some complex configuration. Instead of using a module (which requires us to update 200+ projects), we'd replace it with a "logging_bucket" CDKTF stack and have the stack generate the resources as if we wrote them. So instead of maintaining 200+ projects that create a logging bucket, we only need to maintain a single CDKTF stack and inject the code into a project. This is valuable even if the S3 bucket was implemented as a module.

The tooling suggested up to now provides a less-than-ideal experience for these scenarios. It works well for, say, deploying a web application, but not for platform engineering where Terraform is widely adopted.

@oshea00

oshea00 commented Jul 29, 2023

We have two camps: a group (A) that maintains the AWS environments using hand-rolled Terraform modules and shared state in a DynamoDB backend, and a developer group (B) who need to create serverless components and code, and who would prefer to use CDKTF to generate components that can otherwise require hand-rolling literally thousands of lines of HCL.

Our pipelines are required to run pre-deployment scans on the HCL (Bridgecrew/Checkov or Kics), and applications will not deploy if they can't comply with those guardrails. So HCL output would be a useful way to bridge the gap between the groups.

We are already seeing reluctance to keep current on AWS provider and Terraform versions due to breaking changes that increasingly require cut-and-paste updates across hundreds of Terraform files, and due to the risk of corrupting shared state from a newer breaking change.

cat repos/learn-cdktf/terraform/stacks/learn-cdktf/cdk.tf.json | jq 'walk(if type == "object" then del(.["//"]) else . end)' | ~/json2hcl

If using jq 1.5, add this code to your ~/.jq file:

# Apply f to composite entities recursively, and to atoms
def walk(f):
  . as $in
  | if type == "object" then
      reduce keys_unsorted[] as $key
        ( {}; . + { ($key):  ($in[$key] | walk(f)) } ) | f
  elif type == "array" then map( walk(f) ) | f
  else f
  end;

@nicchongwb

I came across this article: https://nklya.medium.com/how-to-write-hcl2-from-python-53ac12e45874

For my case, I wanted to generate HCL in Python. The article suggests that we can use hclwrite from the HCL Go library. It seems to be a more stable approach, but definitely more tedious.

@mutahhir self-assigned this Nov 24, 2023
@xiehan added this to the 0.20 (tentative) milestone Dec 5, 2023
@mutahhir mentioned this issue Dec 18, 2023
Contributor

github-actions bot commented Feb 9, 2024

I'm going to lock this issue because it has been closed for 30 days. This helps our maintainers find and focus on the active issues. If you've found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions bot locked as resolved and limited conversation to collaborators Feb 9, 2024