Data Sources cannot use local values #11011

Open
KOConchobhair opened this issue May 11, 2021 · 12 comments
Labels
bug hcl2-dag hcl2 sync to jira For issues that need to be imported to Packer internal JIRA backlog

Comments

@KOConchobhair

KOConchobhair commented May 11, 2021

Overview of the Issue

Using local values in a data source configuration results in errors from packer validate.

Reproduction Steps

packer init foo.pkr.hcl
packer validate foo.pkr.hcl

Packer version

Packer v1.7.2

Simplified Packer Buildfile

packer {
  required_version = ">= 1.7.0"

  required_plugins {
    amazon = {
      version = ">= 0.0.1"
      source  = "github.com/hashicorp/amazon"
    }
  }
}

locals {
  assume_role_arn      = "arn:aws:iam::1234567890:role/allow-access-from-other-accounts"
  assume_role_session  = "packer"
  assume_role_duration = 3600
}

data "amazon-ami" "ubuntu_ami" {
  assume_role {
    role_arn         = local.assume_role_arn
    session_name     = local.assume_role_session
    duration_seconds = local.assume_role_duration
  }

  filters = {
    name                = "ubuntu/images/*ubuntu-bionic-18.04-amd64-server-*"
    root-device-type    = "ebs"
    virtualization-type = "hvm"
  }
  most_recent = true
  owners      = ["099720109477"] # This is the ID for Canonical
  region      = "us-east-1"
}

Operating system and Environment details

Linux KOCONNOR 4.4.0-17763-Microsoft #1432-Microsoft Mon Aug 18 18:18:00 PST 2020 x86_64 x86_64 x86_64 GNU/Linux
(WSL in Windows 10)

Log Fragments and crash.log files

2021/05/11 10:53:12 [INFO] Packer version: 1.7.2 [go1.16.3 linux amd64]
2021/05/11 10:53:12 [TRACE] discovering plugins in /usr/local/bin
2021/05/11 10:53:12 [TRACE] discovering plugins in /home/koconnor/.packer.d/plugins
2021/05/11 10:53:12 [TRACE] discovering plugins in .
2021/05/11 10:53:12 [INFO] PACKER_CONFIG env var not set; checking the default config file path
2021/05/11 10:53:12 [INFO] PACKER_CONFIG env var set; attempting to open config file: /home/koconnor/.packerconfig
2021/05/11 10:53:12 [WARN] Config file doesn't exist: /home/koconnor/.packerconfig
2021/05/11 10:53:12 [INFO] Setting cache directory: /mnt/c/Users/koconnor/Work/git/princetonidentity/identity-server-cloud/identity-server/packer_cache
e: Running in background, not using a TTY
2021/05/11 10:53:12 [TRACE] listing potential installations for "github.com/hashicorp/amazon" that match ">= 0.0.1". plugingetter.ListInstallationsOptions{FromFolders:[]string{"/usr/local/bin/packer", ".", "/home/koconnor/.packer.d/plugins"}, BinaryInstallationOptions:plugingetter.BinaryInstallationOptions{APIVersionMajor:"5", APIVersionMinor:"0", OS:"linux", ARCH:"amd64", Ext:"", Checksummers:[]plugingetter.Checksummer{plugingetter.Checksummer{Type:"sha256", Hash:(*sha256.digest)(0xc000296880)}}}}
2021/05/11 10:53:12 [TRACE] Found the following "github.com/hashicorp/amazon" installations: [{/home/koconnor/.packer.d/plugins/github.com/hashicorp/amazon/packer-plugin-amazon_v0.0.1_x5.0_linux_amd64 v0.0.1}]
2021/05/11 10:53:12 [INFO] found external [chroot ebs ebssurrogate ebsvolume instance] builders from amazon plugin
2021/05/11 10:53:12 [INFO] found external [import] post-processors from amazon plugin
2021/05/11 10:53:12 found external [ami secretsmanager] datasource from amazon plugin
2021/05/11 10:53:12 [TRACE] Starting external plugin /home/koconnor/.packer.d/plugins/github.com/hashicorp/amazon/packer-plugin-amazon_v0.0.1_x5.0_linux_amd64 start datasource ami
2021/05/11 10:53:12 Starting plugin: /home/koconnor/.packer.d/plugins/github.com/hashicorp/amazon/packer-plugin-amazon_v0.0.1_x5.0_linux_amd64 []string{"/home/koconnor/.packer.d/plugins/github.com/hashicorp/amazon/packer-plugin-amazon_v0.0.1_x5.0_linux_amd64", "start", "datasource", "ami"}
2021/05/11 10:53:12 Waiting for RPC address for: /home/koconnor/.packer.d/plugins/github.com/hashicorp/amazon/packer-plugin-amazon_v0.0.1_x5.0_linux_amd64
2021/05/11 10:53:12 packer-plugin-amazon_v0.0.1_x5.0_linux_amd64 plugin: 2021/05/11 10:53:12 Plugin address: unix /tmp/packer-plugin458151726
2021/05/11 10:53:12 packer-plugin-amazon_v0.0.1_x5.0_linux_amd64 plugin: 2021/05/11 10:53:12 Waiting for connection...
2021/05/11 10:53:12 Received unix RPC address for /home/koconnor/.packer.d/plugins/github.com/hashicorp/amazon/packer-plugin-amazon_v0.0.1_x5.0_linux_amd64: addr is /tmp/packer-plugin458151726
2021/05/11 10:53:12 packer-plugin-amazon_v0.0.1_x5.0_linux_amd64 plugin: 2021/05/11 10:53:12 Serving a plugin connection...
2021/05/11 10:53:12 packer-plugin-amazon_v0.0.1_x5.0_linux_amd64 plugin: 2021/05/11 10:53:12 [TRACE] starting datasource ami

Error: Unsupported attribute

  on foo.pkr.hcl line 20:
  (source code not available)

This object does not have an attribute named "assume_role_arn".

Error: Unsupported attribute

  on foo.pkr.hcl line 22:
  (source code not available)

This object does not have an attribute named "assume_role_duration".

Error: Unsupported attribute

  on foo.pkr.hcl line 21:
  (source code not available)

This object does not have an attribute named "assume_role_session".

2021/05/11 10:53:12 [INFO] (telemetry) Finalizing.
2021/05/11 10:53:12 waiting for all plugin processes to complete...
2021/05/11 10:53:12 /home/koconnor/.packer.d/plugins/github.com/hashicorp/amazon/packer-plugin-amazon_v0.0.1_x5.0_linux_amd64: plugin process exited
@nywilken
Contributor

Hi @KOConchobhair, thanks for reaching out. I believe this has to do with the fact that Packer resolves data sources before locals, since a local can contain the output of a data source as its value. Essentially, this means that locals cannot currently be used as arguments to a data source, because Packer has not resolved the local attributes yet when the data source is evaluated. I could be mistaken here, so I'll bubble this up to the team for some additional input.

But as a workaround, if you are stuck, you can use variables instead of locals here, since all of the values appear to be hard coded. Locals are best suited for values that are the result of an expression (e.g. the output of a data source, an external function call, or manipulation of variable data).
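
For example, the locals in the original buildfile could be declared as input variables with defaults (a sketch of the suggested workaround; the variable names simply mirror the locals from the report):

variable "assume_role_arn" {
  type    = string
  default = "arn:aws:iam::1234567890:role/allow-access-from-other-accounts"
}

variable "assume_role_session" {
  type    = string
  default = "packer"
}

variable "assume_role_duration" {
  type    = number
  default = 3600
}

data "amazon-ami" "ubuntu_ami" {
  assume_role {
    role_arn         = var.assume_role_arn
    session_name     = var.assume_role_session
    duration_seconds = var.assume_role_duration
  }
  # remaining arguments unchanged from the original buildfile
}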

@nywilken nywilken added the hcl2 label May 11, 2021
@azr
Contributor

azr commented May 12, 2021

This will be solved when we introduce a dependency graph to Packer HCL2. But we're hesitant to do that while JSON is still inside Packer core. This will take some time.

@azr azr added the hcl2-dag label May 12, 2021
@KOConchobhair
Author

Okay, that all makes sense. I was just surprised I was the first to encounter this, so I thought it might have been something I was doing wrong, haha. I tend to use locals more like constants in HCL, but I can certainly use variables here as a workaround.

@DenisBY

DenisBY commented Oct 27, 2021

I may be wrong, but the purpose of 'locals' is to build a new value out of existing variable(s). If you are only assigning a single literal value, it's better to use a 'variable' with a default value.

@project0

project0 commented Feb 8, 2022

Just stumbled across this today. I wonder what makes this difficult to fix in Packer, given that it works in Terraform?

@nywilken
Contributor

nywilken commented Feb 9, 2022

Just stumbled across this today. I wonder what makes this difficult to fix in Packer, given that it works in Terraform?

@project0 great question. It boils down to ordering and dependency handling between blocks in Packer. Terraform has a resource graph that manages dependencies: if block a depends on block b, Terraform knows to evaluate block b before block a.

Packer, on the other hand, doesn't have a graph yet, so it has to enforce ordering manually. In this case, data sources must be evaluated before locals so that their values can be used in locals. We plan to introduce a graph in a later release of Packer, which is why this issue has been assigned the hcl2-dag label. I hope this explanation clarifies things a bit.
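
A rough sketch of what that ordering implies (illustrative only; the id output attribute used below is an assumption about the amazon-ami data source's outputs):

data "amazon-ami" "ubuntu_ami" {
  # pre-DAG, arguments here cannot reference local.* values
  most_recent = true
  owners      = ["099720109477"]
  region      = "us-east-1"
  # filters etc. omitted; see the original buildfile above
}

locals {
  # the other direction works: locals can reference data source outputs,
  # because data sources are evaluated first
  ubuntu_ami_id = data.amazon-ami.ubuntu_ami.id
}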

@mschuchard

mschuchard commented Sep 1, 2022

Just curious where this sits on the current roadmap. Using locals within data blocks is much closer to best practice than using data attributes within locals, and without an external data source option we need locals for external data as a prerequisite to reading data.

I know I could theoretically develop a custom data source that supports external data similar to Terraform's, but I am a bit wary after being one of the small number of third-party Packer plugin developers and having to piece together the information gaps from various other sources.

Thanks!

@nirvana-msu

nirvana-msu commented Sep 22, 2022

For anyone looking for a (partial) workaround: while we cannot yet use locals within a data source, we can use other data sources. Even better, there is a null data source (undocumented?) which takes an arbitrary expression that produces a string. So you can do things like:

data "null" "basic-auth" {
  input = join(" ", [
    "Basic", base64encode(join(":", ["username", vault("kv/data/secrets", "password")]))
  ])
}

data "http" "call-rest-api" {
  url = "http://host:port/some/rest/api"

  request_headers = {
    Accept        = "application/json"
    Authorization = data.null.basic-auth.output
  }
}

In other words, you can re-use that computed null data source value just like you would re-use a local.
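
Applied to the original report, that would look something like the following sketch (an assumption based on the example above; since the null data source carries a string, a numeric argument like duration_seconds still needs a literal or a variable):

data "null" "assume_role_arn" {
  input = "arn:aws:iam::1234567890:role/allow-access-from-other-accounts"
}

data "amazon-ami" "ubuntu_ami" {
  assume_role {
    role_arn         = data.null.assume_role_arn.output
    session_name     = "packer"
    duration_seconds = 3600
  }
  # remaining arguments unchanged from the original buildfile
}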

@mschuchard

mschuchard commented Sep 26, 2022

Thanks for the cool workaround for this feature gap using an undocumented feature. There is an "optional provisioner" pattern I worked out that also relies on undocumented Packer features, so this kind of solution is not a rare implementation design. Now we are closer to best practices, and the code in data blocks does not have to be an unreadable mess.

The above trick is probably going to be a long-term solution as well, because of the reverse dependency decision and because supporting both orderings would be a significant development effort that is currently stalled. Thanks again!

Edit: Also, thanks to Megan and Adrien for the null data source!

@nywilken nywilken added the sync to jira For issues that need to be imported to Packer internal JIRA backlog label Sep 30, 2022
@github-actions

This issue has been synced to JIRA for planning.

JIRA ID: HPR-755

@hegerdes
Contributor

I want to bump this up again. It's still a problem in 2024!

I also just tried to pass a computed URL to the http data source and got the error below, which was totally confusing since the referenced attribute is clearly there. We need to be able to have computed inputs for data sources, and it's weird that this shortcoming is not documented anywhere while the error message is not very helpful.

Is there an update on the roadmap for supporting this?

Error: Unsupported attribute

  on talos.pkr.hcl line 33:
  (source code not available)

This object does not have an attribute named "talos_download_url".

Thanks for the workaround @nirvana-msu

@kwohlfahrt

I think this is now resolved, possibly by this feature in 1.12.0:

core: add support for a DAG-based evaluation on locals and datasources.

At least, I'm able to use a local variable in my data source now.
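
If that is correct, pinning the required version should be enough for the original buildfile to validate as written (a sketch, assuming the 1.12.0 changelog entry quoted above is the relevant change):

packer {
  # assumed minimum for DAG-based evaluation of locals and datasources
  required_version = ">= 1.12.0"

  required_plugins {
    amazon = {
      version = ">= 0.0.1"
      source  = "github.com/hashicorp/amazon"
    }
  }
}

# The locals block and the data "amazon-ami" block from the original report
# can then stay as-is, with local.* references inside the data source.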
