diff --git a/README.md b/README.md index 201f265..0882475 100644 --- a/README.md +++ b/README.md @@ -1,566 +1,565 @@ # Puppet Archive [![License](https://img.shields.io/github/license/voxpupuli/puppet-archive.svg)](https://github.com/voxpupuli/puppet-archive/blob/master/LICENSE) [![Build Status](https://travis-ci.org/voxpupuli/puppet-archive.png?branch=master)](https://travis-ci.org/voxpupuli/puppet-archive) [![Code Coverage](https://coveralls.io/repos/github/voxpupuli/puppet-archive/badge.svg?branch=master)](https://coveralls.io/github/voxpupuli/puppet-archive) [![Puppet Forge](https://img.shields.io/puppetforge/v/puppet/archive.svg)](https://forge.puppetlabs.com/puppet/archive) [![Puppet Forge - downloads](https://img.shields.io/puppetforge/dt/puppet/archive.svg)](https://forge.puppetlabs.com/puppet/archive) [![Puppet Forge - endorsement](https://img.shields.io/puppetforge/e/puppet/archive.svg)](https://forge.puppetlabs.com/puppet/archive) [![Puppet Forge - scores](https://img.shields.io/puppetforge/f/puppet/archive.svg)](https://forge.puppetlabs.com/puppet/archive) [![Camptocamp compatible](https://img.shields.io/badge/camptocamp-compatible-orange.svg)](https://forge.puppet.com/camptocamp/archive) ## Table of Contents 1. [Overview](#overview) 1. [Module Description](#module-description) 1. [Setup](#setup) 1. [Usage](#usage) * [Example](#usage-example) * [Puppet URL](#puppet-url) * [File permission](#file-permission) * [Network files](#network-files) * [Extract customization](#extract-customization) * [S3 Bucket](#s3-bucket) * [GS Bucket](#gs-bucket) * [Migrating from puppet-staging](#migrating-from-puppet-staging) 1. [Reference](#reference) 1. [Development](#development) ## Overview This module manages download, deployment, and cleanup of archive files. ## Module Description This module uses types and providers to download and manage compressed files, with optional lifecycle functionality such as checksum verification, extraction, and cleanup.
The benefits over existing modules such as [puppet-staging](https://github.com/voxpupuli/puppet-staging): * Implemented via types and providers instead of exec resources. * Follows 302 redirects and propagates download failures. * Optional checksum verification of archive files. * Automatic dependency on the parent directory. * Supports Windows file extraction via 7zip or PowerShell (zip files only). * Able to clean up archive files after extraction. This module is compatible with [camptocamp/archive](https://forge.puppet.com/camptocamp/archive). For this purpose it provides compatibility shims. ## Setup On Windows, 7zip is required to extract all archives except zip files, which will be extracted with PowerShell if 7zip is not available (requires `System.IO.Compression.FileSystem`, Windows 2012+). Windows clients can install 7zip via `include 'archive'`. On POSIX systems, curl is the default provider. The default provider can be overridden by configuring resource defaults in site.pp: ```puppet Archive { provider => 'ruby', } ``` Users of the module are responsible for archive package dependencies for alternative providers and for all extraction utilities such as tar, gunzip, and bunzip: ```puppet if $facts['osfamily'] != 'windows' { package { 'wget': ensure => present, } package { 'bunzip': ensure => present, } Archive { provider => 'wget', require => Package['wget', 'bunzip'], } } ``` ## Usage Archive module dependencies are managed by the `archive` class. This is only required on Windows. By default 7zip is installed via Chocolatey, but the MSI package can be installed instead: ```puppet class { 'archive': seven_zip_name => '7-Zip 9.20 (x64 edition)', seven_zip_source => 'C:/Windows/Temp/7z920-x64.msi', seven_zip_provider => 'windows', } ``` To automatically load archives as part of this class you can define the `archives` parameter.
```puppet class { 'archive': archives => { '/tmp/jta-1.1.jar' => { 'ensure' => 'present', 'source' => 'http://central.maven.org/maven2/javax/transaction/jta/1.1/jta-1.1.jar', }, } } ``` ### Usage Example A simple example that downloads from a web server: ```puppet archive { '/tmp/vagrant.deb': ensure => present, source => 'https://releases.hashicorp.com/vagrant/2.2.3/vagrant_2.2.3_x86_64.deb', user => 0, group => 0, } ``` A more complex example: ```puppet include 'archive' # NOTE: optional for POSIX platforms archive { '/tmp/jta-1.1.jar': ensure => present, extract => true, extract_path => '/tmp', source => 'http://central.maven.org/maven2/javax/transaction/jta/1.1/jta-1.1.jar', checksum => '2ca09f0b36ca7d71b762e14ea2ff09d5eac57558', checksum_type => 'sha1', creates => '/tmp/javax', cleanup => true, } archive { '/tmp/test100k.db': source => 'ftp://ftp.otenet.gr/test100k.db', username => 'speedtest', password => 'speedtest', } ``` If you want to extract a `.tar.gz` file: ```puppet $install_path = '/opt/wso2' $package_name = 'wso2esb' $package_ensure = '4.9.0' $repository_url = 'http://company.com/repository/wso2' $archive_name = "${package_name}-${package_ensure}.tgz" $wso2_package_source = "${repository_url}/${archive_name}" archive { $archive_name: path => "/tmp/${archive_name}", source => $wso2_package_source, extract => true, extract_path => $install_path, creates => "${install_path}/${package_name}-${package_ensure}", cleanup => true, require => File['wso2_appdir'], } ``` ### Puppet URL Since March 2017, the Archive type also supports puppet URLs.
Here is an example of how to use this: ```puppet archive { '/home/myuser/help': source => 'puppet:///modules/profile/help.tar.gz', extract => true, extract_path => $homedir, creates => "${homedir}/help", # directory inside the tgz } ``` ### File permission When extracting files as a non-root user, either ensure the target directory exists with the appropriate permissions (see [tomcat.pp](tests/tomcat.pp) for a full working example): ```puppet $dirname = 'apache-tomcat-9.0.0.M3' $filename = "${dirname}.zip" $install_path = "/opt/${dirname}" file { $install_path: ensure => directory, owner => 'tomcat', group => 'tomcat', mode => '0755', } archive { $filename: path => "/tmp/${filename}", source => 'http://www-eu.apache.org/dist/tomcat/tomcat-9/v9.0.0.M3/bin/apache-tomcat-9.0.0.M3.zip', checksum => 'f2aaf16f5e421b97513c502c03c117fab6569076', checksum_type => 'sha1', extract => true, extract_path => '/opt', creates => "${install_path}/bin", cleanup => true, user => 'tomcat', group => 'tomcat', require => File[$install_path], } ``` or use an exec that subscribes to the archive resource to chown the directory afterwards: ```puppet $dirname = 'apache-tomcat-9.0.0.M3' $filename = "${dirname}.zip" $install_path = "/opt/${dirname}" file { '/opt/tomcat': ensure => 'link', target => $install_path } archive { $filename: path => "/tmp/${filename}", source => "http://www-eu.apache.org/dist/tomcat/tomcat-9/v9.0.0.M3/bin/apache-tomcat-9.0.0.M3.zip", checksum => 'f2aaf16f5e421b97513c502c03c117fab6569076', checksum_type => 'sha1', extract => true, extract_path => '/opt', creates => $install_path, cleanup => true, require => File[$install_path], } exec { 'tomcat permission': command => "chown tomcat:tomcat $install_path", path => $path, subscribe => Archive[$filename], } ``` ### Network files For large binary files that need to be extracted locally, instead of copying the file from the network file share, simply set the file path to be the same as the source, and archive will use the network file location: ```puppet
archive { '/nfs/repo/software.zip': source => '/nfs/repo/software.zip', extract => true, extract_path => '/opt', checksum_type => 'none', # typically unnecessary cleanup => false, # keep the file on the server } ``` ### Extract Customization The `extract_flags` or `extract_command` parameters can be used to override the default extraction command/flag (defaults are specified in [archive.rb](lib/puppet_x/bodeco/archive.rb)). ```puppet # tar stripping directories: archive { '/var/lib/kafka/kafka_2.10-0.8.2.1.tgz': ensure => present, extract => true, extract_command => 'tar xfz %s --strip-components=1', extract_path => '/opt/kafka_2.10-0.8.2.1', cleanup => true, creates => '/opt/kafka_2.10-0.8.2.1/config', } # zip freshen existing files (zip -of %s instead of zip -o %s): archive { '/var/lib/example.zip': extract => true, extract_path => '/opt', extract_flags => '-of', cleanup => true, subscribe => ..., } ``` ### S3 bucket S3 support is implemented via the [AWS CLI](http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-welcome.html). On non-Windows systems, the `archive` class will install this dependency when the `aws_cli_install` parameter is set to `true`: ```puppet class { 'archive': aws_cli_install => true, } # See AWS CLI guide for credential and configuration settings: # http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html file { '/root/.aws/credentials': ensure => file, ... } file { '/root/.aws/config': ensure => file, ... } archive { '/tmp/gravatar.png': ensure => present, source => 's3://bodecoio/gravatar.png', } ``` NOTE: Alternative s3 provider support can be implemented by overriding the [s3_download method](lib/puppet/provider/archive/ruby.rb). ### GS bucket GSUtil support is implemented via the [GSUtil Package](https://cloud.google.com/storage/docs/gsutil).
On non-Windows systems, the `archive` class will install this dependency when the `gsutil_install` parameter is set to `true`: ```puppet class { 'archive': gsutil_install => true, } # See Google Cloud SDK cli guide for credential and configuration settings: # https://cloud.google.com/storage/docs/quickstart-gsutil archive { '/tmp/gravatar.png': ensure => present, source => 'gs://bodecoio/gravatar.png', } ``` ### Download customizations In some cases you may need custom flags for curl/wget/s3/gsutil, which can be supplied via `download_options`. Since this parameter is provider specific, beware of the order of defaults: * s3:// files accept AWS CLI options: ```puppet archive { '/tmp/gravatar.png': ensure => present, source => 's3://bodecoio/gravatar.png', download_options => ['--region', 'eu-central-1'], } ``` * puppet `provider` override: ```puppet archive { '/tmp/jta-1.1.jar': ensure => present, source => 'http://central.maven.org/maven2/javax/transaction/jta/1.1/jta-1.1.jar', provider => 'wget', download_options => '--continue', } ``` * The Linux default provider is `curl`, and the Windows default is `ruby` (where `download_options` has no effect). This option can also be applied globally to address issues for a specific OS: ```puppet if $facts['osfamily'] != 'RedHat' { Archive { download_options => '--tlsv1', } } ``` ### Migrating from puppet-staging It is recommended to use puppet-archive instead of puppet-staging. Users wishing to migrate may find the following examples useful.
#### puppet-staging (without extraction) ```puppet class { 'staging': path => '/tmp/staging', } staging::file { 'master.zip': source => 'https://github.com/voxpupuli/puppet-archive/archive/master.zip', } ``` #### puppet-archive (without extraction) ```puppet archive { '/tmp/staging/master.zip': source => 'https://github.com/voxpupuli/puppet-archive/archive/master.zip', } ``` #### puppet-staging (with zip file extraction) ```puppet class { 'staging': path => '/tmp/staging', } staging::file { 'master.zip': source => 'https://github.com/voxpupuli/puppet-archive/archive/master.zip', } -> staging::extract { 'master.zip': target => '/tmp/staging/master.zip', creates => '/tmp/staging/puppet-archive-master', } ``` #### puppet-archive (with zip file extraction) ```puppet archive { '/tmp/staging/master.zip': source => 'https://github.com/voxpupuli/puppet-archive/archive/master.zip', extract => true, extract_path => '/tmp/staging', creates => '/tmp/staging/puppet-archive-master', cleanup => false, } ``` ## Reference ### Classes * `archive`: installs the 7zip package (Windows only) and the AWS CLI or gsutil for s3/gs support. It also permits passing an `archives` argument to generate `archive` resources. * `archive::staging`: installs package dependencies and creates a staging directory for backwards compatibility. Use the archive class instead if you do not need the staging directory. ### Defined Resources * `archive::artifactory`: archive wrapper for [JFrog Artifactory](http://www.jfrog.com/open-source/#os-arti) files with checksum. * `archive::go`: archive wrapper for [GO Continuous Delivery](http://www.go.cd/) files with checksum. * `archive::nexus`: archive wrapper for [Sonatype Nexus](http://www.sonatype.org/nexus/) files with checksum. * `archive::download`: archive wrapper and compatibility shim for [camptocamp/archive](https://forge.puppet.com/camptocamp/archive). This is considered private API, as it has to change with camptocamp/archive.
For this reason it will remain undocumented and will be removed when no longer needed. We suggest not using it directly; instead, please consider migrating to archive itself where possible. ### Resources #### Archive * `ensure`: whether archive file should be present/absent (default: present). * `path`: namevar, archive file fully qualified file path. * `filename`: archive file name (derived from path). * `source`: archive file source, supports http|https|ftp|file|s3|gs uri. * `username`: username to download source file. * `password`: password to download source file. -* `cacert_file`: Path to a CA certificate bundle. * `allow_insecure`: Ignore HTTPS certificate errors (true|false). (default: false) * `cookie`: archive file download cookie. * `checksum_type`: archive file checksum type (none|md5|sha1|sha2|sha256|sha384|sha512). (default: none) * `checksum`: archive file checksum (match checksum_type). * `checksum_url`: archive file checksum source (instead of specifying the checksum). * `checksum_verify`: whether checksum will be verified (true|false). (default: true) * `extract`: whether archive will be extracted after download (true|false). (default: false) * `extract_path`: target folder path to extract archive. * `extract_command`: custom extraction command ('tar xvf example.tar.gz'), also supports sprintf format ('tar xvf %s') which will be processed with the filename: sprintf('tar xvf %s', filename) * `temp_dir`: Specify an alternative temporary directory to use for copying files; if unset, the operating system default will be used. * `extract_flags`: custom extraction options, this replaces the default flags. A string such as 'xvf' for a tar file would replace the default xf flag. A hash is useful when custom flags are needed for different platforms. {'tar' => 'xzf', '7z' => 'x -aot'}. * `user`: extract command user (using this option will configure the archive file permission to 0644 so the user can read the file).
* `group`: extract command group (using this option will configure the archive file permission to 0644 so the user can read the file). * `cleanup`: whether archive file will be removed after extraction (true|false). (default: true) * `creates`: if file/directory exists, will not download/extract archive. * `proxy_server`: specify a proxy server, with port number if needed, e.g. `https://example.com:8080`. * `proxy_type`: proxy server type (none|http|https|ftp) #### Archive::Artifactory * `path`: fully qualified filepath for the downloaded file, or use archive_path and only supply the filename (namevar). * `ensure`: ensure the file is present/absent. * `url`: artifactory download url filepath. NOTE: replaces server, port, url_path parameters. * `server`: artifactory server name (deprecated). * `port`: artifactory server port (deprecated). * `url_path`: artifactory file path `http://{server}:{port}/artifactory/{url_path}` (deprecated). * `owner`: file owner (see archive params for defaults). * `group`: file group (see archive params for defaults). * `mode`: file mode (see archive params for defaults). * `archive_path`: the parent directory of the local filepath. * `extract`: whether to extract the files (true/false). * `creates`: the file created when the archive is extracted. * `cleanup`: remove archive file after file extraction (true/false).
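The sprintf behavior of `extract_command` described in the Archive reference above can be sketched in plain Ruby. This is an illustrative sketch, not code from the module: the helper name `expand_command` is hypothetical, but the `sprintf('tar xvf %s', filename)` expansion is exactly what the documentation states.

```ruby
# Hypothetical helper illustrating how a '%s'-style extract_command
# template is combined with the archive filename via sprintf.
def expand_command(template, filename)
  if template.include?('%s')
    sprintf(template, filename) # substitute the filename into the template
  else
    template # a literal command is used as-is
  end
end

puts expand_command('tar xfz %s --strip-components=1', 'kafka_2.10-0.8.2.1.tgz')
# => tar xfz kafka_2.10-0.8.2.1.tgz --strip-components=1
puts expand_command('tar xvf example.tar.gz', 'example.tar.gz')
# => tar xvf example.tar.gz
```

Note that `%s` may appear anywhere in the template, which is how the kafka example in "Extract Customization" places extra flags after the filename.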
#### Archive::Artifactory Example ```puppet $dirname = 'gradle-1.0-milestone-4-20110723151213+0300' $filename = "${dirname}-bin.zip" archive::artifactory { $filename: archive_path => '/tmp', url => "http://repo.jfrog.org/artifactory/distributions/org/gradle/${filename}", extract => true, extract_path => '/opt', creates => "/opt/${dirname}", cleanup => true, } file { '/opt/gradle': ensure => link, target => "/opt/${dirname}", } ``` #### Archive::Nexus Example ```puppet archive::nexus { '/tmp/jtstand-ui-0.98.jar': url => 'https://oss.sonatype.org', gav => 'org.codehaus.jtstand:jtstand-ui:0.98', repository => 'codehaus-releases', packaging => 'jar', extract => false, } ``` ## Development We highly welcome new contributions to this module, especially those that include documentation and rspec tests. We will happily guide you through the process, so, yes, please submit that pull request! Note: If you are writing a dependent module that includes specs, you will need to set the puppetversion fact in your puppet-rspec tests. You can do that by adding it to the default facts of your spec/spec_helper.rb: ```ruby RSpec.configure do |c| c.default_facts = { :puppetversion => Puppet.version } end ``` diff --git a/lib/puppet/provider/archive/curl.rb b/lib/puppet/provider/archive/curl.rb index 707fcca..87d207f 100644 --- a/lib/puppet/provider/archive/curl.rb +++ b/lib/puppet/provider/archive/curl.rb @@ -1,77 +1,76 @@ require 'uri' require 'tempfile' Puppet::Type.type(:archive).provide(:curl, parent: :ruby) do commands curl: 'curl' defaultfor feature: :posix def curl_params(params) if resource[:username] if resource[:username] =~ %r{\s} || resource[:password] =~ %r{\s} Puppet.warning('Username or password contains a space.
Unable to use netrc file to hide credentials') account = [resource[:username], resource[:password]].compact.join(':') params += optional_switch(account, ['--user', '%s']) else create_netrcfile params += ['--netrc-file', @netrc_file.path] end end params += optional_switch(resource[:proxy_server], ['--proxy', '%s']) params += ['--insecure'] if resource[:allow_insecure] params += resource[:download_options] if resource[:download_options] params += optional_switch(resource[:cookie], ['--cookie', '%s']) - params += optional_switch(resource[:cacert_file], ['--cacert', '%s']) params end def create_netrcfile @netrc_file = Tempfile.new('.puppet_archive_curl') machine = URI.parse(resource[:source]).host @netrc_file.write("machine #{machine}\nlogin #{resource[:username]}\npassword #{resource[:password]}\n") @netrc_file.close end def delete_netrcfile return if @netrc_file.nil? @netrc_file.unlink @netrc_file = nil end def download(filepath) params = curl_params( [ resource[:source], '-o', filepath, '-fsSLg', '--max-redirs', 5 ] ) begin curl(params) ensure delete_netrcfile end end def remote_checksum params = curl_params( [ resource[:checksum_url], '-fsSLg', '--max-redirs', 5 ] ) begin curl(params)[%r{\b[\da-f]{32,128}\b}i] ensure delete_netrcfile end end end diff --git a/lib/puppet/provider/archive/ruby.rb b/lib/puppet/provider/archive/ruby.rb index 5f75285..573f43a 100644 --- a/lib/puppet/provider/archive/ruby.rb +++ b/lib/puppet/provider/archive/ruby.rb @@ -1,262 +1,261 @@ begin require 'puppet_x/bodeco/archive' require 'puppet_x/bodeco/util' rescue LoadError require 'pathname' # WORK_AROUND #14073 and #7788 archive = Puppet::Module.find('archive', Puppet[:environment].to_s) raise(LoadError, "Unable to find archive module in modulepath #{Puppet[:basemodulepath] || Puppet[:modulepath]}") unless archive require File.join archive.path, 'lib/puppet_x/bodeco/archive' require File.join archive.path, 'lib/puppet_x/bodeco/util' end require 'securerandom' require 'tempfile' # This 
provider implements a simple state-machine. The following attempts to # # document it. In general, `def adjective?` implements a [state], while `def # verb` implements an {action}. # Some states are more complex, as they might depend on other states, or trigger # actions. Since this implements an ad-hoc state-machine, many actions or states # have to guard themselves against being called out of order. # # [exists?] # | # v # [extracted?] -> no -> [checksum?] # | # v # yes # | # v # [path.exists?] -> no -> {cleanup} # | | | # v v v # [checksum?] yes. [extracted?] && [cleanup?] # | # v # {destroy} # # Now, with [exists?] defined, we can define [ensure] # But that's just part of the standard puppet provider state-machine: # # [ensure] -> absent -> [exists?] -> no. # | | # v v # present yes # | | # v v # [exists?] {destroy} # | # v # {create} # # Here's how we would extend archive for an `ensure => latest`: # # [exists?] -> no -> {create} # | # v # yes # | # v # [ttl?] -> expired -> {destroy} -> {create} # | # v # valid. # Puppet::Type.type(:archive).provide(:ruby) do optional_commands aws: 'aws' optional_commands gsutil: 'gsutil' defaultfor feature: :microsoft_windows attr_reader :archive_checksum def exists? return checksum? unless extracted? return checksum? if File.exist? archive_filepath cleanup true end def create transfer_download(archive_filepath) unless checksum? extract cleanup end def destroy FileUtils.rm_f(archive_filepath) if File.exist?(archive_filepath) end def archive_filepath resource[:path] end def tempfile_name if resource[:checksum] == 'none' "#{resource[:filename]}_#{SecureRandom.base64}" else "#{resource[:filename]}_#{resource[:checksum]}" end end def creates if resource[:extract] == :true extracted? ? 
resource[:creates] : 'archive not extracted' else resource[:creates] end end def creates=(_value) extract end def checksum resource[:checksum] || (resource[:checksum] = remote_checksum if resource[:checksum_url]) end def remote_checksum PuppetX::Bodeco::Util.content( resource[:checksum_url], username: resource[:username], password: resource[:password], cookie: resource[:cookie], proxy_server: resource[:proxy_server], proxy_type: resource[:proxy_type], insecure: resource[:allow_insecure] )[%r{\b[\da-f]{32,128}\b}i] end # Private: See if local archive checksum matches. # returns boolean def checksum?(store_checksum = true) return false unless File.exist? archive_filepath return true if resource[:checksum_type] == :none archive = PuppetX::Bodeco::Archive.new(archive_filepath) archive_checksum = archive.checksum(resource[:checksum_type]) @archive_checksum = archive_checksum if store_checksum checksum == archive_checksum end def cleanup return unless extracted? && resource[:cleanup] == :true Puppet.debug("Cleanup archive #{archive_filepath}") destroy end def extract return unless resource[:extract] == :true raise(ArgumentError, 'missing archive extract_path') unless resource[:extract_path] PuppetX::Bodeco::Archive.new(archive_filepath).extract( resource[:extract_path], custom_command: resource[:extract_command], options: resource[:extract_flags], uid: resource[:user], gid: resource[:group] ) end def extracted? resource[:creates] && File.exist?(resource[:creates]) end def transfer_download(archive_filepath) if resource[:temp_dir] && !File.directory?(resource[:temp_dir]) raise Puppet::Error, "Temporary directory #{resource[:temp_dir]} doesn't exist" end tempfile = Tempfile.new(tempfile_name, resource[:temp_dir]) temppath = tempfile.path tempfile.close! 
case resource[:source] when %r{^(puppet)} puppet_download(temppath) when %r{^(http|ftp)} download(temppath) when %r{^file} uri = URI(resource[:source]) FileUtils.copy(Puppet::Util.uri_to_path(uri), temppath) when %r{^s3} s3_download(temppath) when %r{^gs} gs_download(temppath) when nil raise(Puppet::Error, 'Unable to fetch archive, the source parameter is nil.') else raise(Puppet::Error, "Source file: #{resource[:source]} does not exist.") unless File.exist?(resource[:source]) FileUtils.copy(resource[:source], temppath) end # conditionally verify checksum: if resource[:checksum_verify] == :true && resource[:checksum_type] != :none archive = PuppetX::Bodeco::Archive.new(temppath) actual_checksum = archive.checksum(resource[:checksum_type]) if actual_checksum != checksum destroy(temppath) raise(Puppet::Error, "Download file checksum mismatch (expected: #{checksum} actual: #{actual_checksum})") end end move_file_in_place(temppath, archive_filepath) end def move_file_in_place(from, to) # Ensure the destination directory exists.
FileUtils.mkdir_p(File.dirname(to)) FileUtils.mv(from, to) end def download(filepath) PuppetX::Bodeco::Util.download( resource[:source], filepath, username: resource[:username], password: resource[:password], cookie: resource[:cookie], proxy_server: resource[:proxy_server], proxy_type: resource[:proxy_type], - insecure: resource[:allow_insecure], - cacert_file: resource[:cacert_file] + insecure: resource[:allow_insecure] ) end def puppet_download(filepath) PuppetX::Bodeco::Util.puppet_download( resource[:source], filepath ) end def s3_download(path) params = [ 's3', 'cp', resource[:source], path ] params += resource[:download_options] if resource[:download_options] aws(params) end def gs_download(path) params = [ 'cp', resource[:source], path ] params += resource[:download_options] if resource[:download_options] gsutil(params) end def optional_switch(value, option) if value option.map { |flags| flags % value } else [] end end end diff --git a/lib/puppet/provider/archive/wget.rb b/lib/puppet/provider/archive/wget.rb index 6de34af..b360f81 100644 --- a/lib/puppet/provider/archive/wget.rb +++ b/lib/puppet/provider/archive/wget.rb @@ -1,46 +1,45 @@ Puppet::Type.type(:archive).provide(:wget, parent: :ruby) do commands wget: 'wget' def wget_params(params) username = Shellwords.shellescape(resource[:username]) if resource[:username] password = Shellwords.shellescape(resource[:password]) if resource[:password] params += optional_switch(username, ['--user=%s']) params += optional_switch(password, ['--password=%s']) params += optional_switch(resource[:cookie], ['--header="Cookie: %s"']) params += optional_switch(resource[:proxy_server], ['-e use_proxy=yes', "-e #{resource[:proxy_type]}_proxy=#{resource[:proxy_server]}"]) params += ['--no-check-certificate'] if resource[:allow_insecure] params += resource[:download_options] if resource[:download_options] - params += optional_switch(resource[:cacert_file], ['--ca-certificate=%s']) params end def download(filepath) params = 
wget_params( [ Shellwords.shellescape(resource[:source]), '-O', filepath, '--max-redirect=5' ] ) # NOTE: # Do NOT use wget(params) until https://tickets.puppetlabs.com/browse/PUP-6066 is resolved. command = "wget #{params.join(' ')}" Puppet::Util::Execution.execute(command) end def remote_checksum params = wget_params( [ '-qO-', Shellwords.shellescape(resource[:checksum_url]), '--max-redirect=5' ] ) command = "wget #{params.join(' ')}" Puppet::Util::Execution.execute(command)[%r{\b[\da-f]{32,128}\b}i] end end diff --git a/lib/puppet/type/archive.rb b/lib/puppet/type/archive.rb index b67d835..8a0c2ff 100644 --- a/lib/puppet/type/archive.rb +++ b/lib/puppet/type/archive.rb @@ -1,297 +1,288 @@ require 'pathname' require 'uri' require 'puppet/util' require 'puppet/parameter/boolean' Puppet::Type.newtype(:archive) do @doc = 'Manage archive file download, extraction, and cleanup.' ensurable do desc 'whether archive file should be present/absent (default: present)' newvalue(:present) do provider.create end newvalue(:absent) do provider.destroy end defaultto(:present) # The following changes allows us to notify if the resource is being replaced def is_to_s(value) # rubocop:disable Style/PredicateName return "(#{resource[:checksum_type]})#{provider.archive_checksum}" if provider.archive_checksum super end def should_to_s(value) return "(#{resource[:checksum_type]})#{resource[:checksum]}" if provider.archive_checksum super end def change_to_s(currentvalue, newvalue) if currentvalue == :absent || currentvalue.nil? extract = resource[:extract] == :true ? "and extracted in #{resource[:extract_path]}" : '' cleanup = resource[:cleanup] == :true ? 
'with cleanup' : 'without cleanup' if provider.archive_checksum "replace archive: #{provider.archive_filepath} from #{is_to_s(currentvalue)} to #{should_to_s(newvalue)}" else "download archive from #{resource[:source]} to #{provider.archive_filepath} #{extract} #{cleanup}" end elsif newvalue == :absent "remove archive: #{provider.archive_filepath} " else super end rescue StandardError super end end newparam(:path, namevar: true) do desc 'namevar, archive file fully qualified file path.' validate do |value| unless Puppet::Util.absolute_path? value raise ArgumentError, "archive path must be absolute: #{value}" end end end newparam(:filename) do desc 'archive file name (derived from path).' end newparam(:extract) do desc 'whether archive will be extracted after download (true|false).' newvalues(:true, :false) defaultto(:false) end newparam(:extract_path) do desc 'target folder path to extract archive.' validate do |value| unless Puppet::Util.absolute_path? value raise ArgumentError, "archive extract_path must be absolute: #{value}" end end end newparam(:target) do desc 'target folder path to extract archive. (this parameter is for camptocamp/archive compatibility)' validate do |value| unless Puppet::Util.absolute_path? value raise ArgumentError, "archive extract_path must be absolute: #{value}" end end munge do |val| resource[:extract_path] = val end end newparam(:extract_command) do desc "custom extraction command ('tar xvf example.tar.gz'), also support sprintf format ('tar xvf %s') which will be processed with the filename: sprintf('tar xvf %s', filename)" end newparam(:temp_dir) do desc 'Specify an alternative temporary directory to use for copying files, if unset then the operating system default will be used.' validate do |value| unless Puppet::Util.absolute_path?(value) raise ArgumentError, "Invalid temp_dir #{value}" end end end newparam(:extract_flags) do desc "custom extraction options, this replaces the default flags. 
A string such as 'xvf' for a tar file would replace the default xf flag. A hash is useful when custom flags are needed for different platforms. {'tar' => 'xzf', '7z' => 'x -aot'}." defaultto(:undef) end newproperty(:creates) do desc 'if file/directory exists, will not download/extract archive.' def should_to_s(value) "extracting in #{resource[:extract_path]} to create #{value}" end end newparam(:cleanup) do desc 'whether archive file will be removed after extraction (true|false).' newvalues(:true, :false) defaultto(:true) end newparam(:source) do desc 'archive file source, supports puppet|http|https|ftp|file|s3|gs uri.' validate do |value| unless value =~ URI.regexp(%w[puppet http https ftp file s3 gs]) || Puppet::Util.absolute_path?(value) raise ArgumentError, "invalid source url: #{value}" end end end newparam(:url) do desc 'archive file source, supports http|https|ftp|file uri. (for camptocamp/archive compatibility)' validate do |value| unless value =~ URI.regexp(%w[http https file ftp]) raise ArgumentError, "invalid source url: #{value}" end end munge do |val| resource[:source] = val end end newparam(:cookie) do desc 'archive file download cookie.' end newparam(:checksum) do desc 'archive file checksum (match checksum_type).' newvalues(%r{\b[0-9a-f]{5,128}\b}, :true, :false, :undef, nil, '') munge do |val| if val.nil? || val.empty? || val == :undef :false elsif [:true, :false].include? val resource[:checksum_verify] = val else val end end end newparam(:digest_string) do desc 'archive file checksum (match checksum_type) (this parameter is for camptocamp/archive compatibility).' newvalues(%r{\b[0-9a-f]{5,128}\b}) munge do |val| if !val.nil? && !val.empty? 
resource[:checksum] = val else val end end end newparam(:checksum_url) do desc 'archive file checksum source (instead of specifying checksum)' end newparam(:digest_url) do desc 'archive file checksum source (instead of specifying checksum) (this parameter is for camptocamp/archive compatibility)' munge do |val| resource[:checksum_url] = val end end newparam(:checksum_type) do desc 'archive file checksum type (none|md5|sha1|sha2|sha256|sha384|sha512).' newvalues(:none, :md5, :sha1, :sha2, :sha256, :sha384, :sha512) defaultto(:none) end newparam(:digest_type) do desc 'archive file checksum type (none|md5|sha1|sha2|sha256|sha384|sha512) (this parameter is for camptocamp/archive compatibility).' newvalues(:none, :md5, :sha1, :sha2, :sha256, :sha384, :sha512) munge do |val| resource[:checksum_type] = val end end newparam(:checksum_verify) do desc 'whether checksum will be verified (true|false).' newvalues(:true, :false) defaultto(:true) end newparam(:username) do desc 'username to download source file.' end newparam(:password) do desc 'password to download source file.' end newparam(:user) do desc 'extract command user (using this option will configure the archive file permission to 0644 so the user can read the file).' end newparam(:group) do desc 'extract command group (using this option will configure the archive file permission to 0644 so the user can read the file).' end newparam(:proxy_type) do desc 'proxy type (none|ftp|http|https)' newvalues(:none, :ftp, :http, :https) end newparam(:proxy_server) do desc 'proxy address to use when accessing source' end newparam(:allow_insecure, boolean: true, parent: Puppet::Parameter::Boolean) do desc 'ignore HTTPS certificate errors' defaultto :false end - newparam(:cacert_file) do - desc 'path to a custom CA certificate bundle file' - validate do |value| - if !value.nil?
&& !Puppet::Util.absolute_path?(value) - raise ArgumentError, "cacert_file must be absolute: #{value}" - end - end - end - newparam(:download_options) do desc 'provider download options (affects curl, wget, gs, and only s3 downloads for ruby provider)' validate do |val| unless val.is_a?(::String) || val.is_a?(::Array) raise ArgumentError, "download_options should be String or Array: #{val}" end end munge do |val| case val when ::String [val] else val end end end autorequire(:file) do [ Pathname.new(self[:path]).parent.to_s, self[:extract_path], '/root/.aws/config', '/root/.aws/credentials' ].compact end autorequire(:exec) do ['install_aws_cli'] end autorequire(:exec) do ['install_gsutil'] end validate do filepath = Pathname.new(self[:path]) self[:filename] = filepath.basename.to_s if !self[:source].nil? && !self[:url].nil? && self[:source] != self[:url] raise ArgumentError, "invalid parameter: url (#{self[:url]}) and source (#{self[:source]}) are mutually exclusive." end if !self[:checksum_url].nil? && !self[:digest_url].nil? && self[:checksum_url] != self[:digest_url] raise ArgumentError, "invalid parameter: checksum_url (#{self[:checksum_url]}) and digest_url (#{self[:digest_url]}) are mutually exclusive." 
end if self[:proxy_server] self[:proxy_type] ||= URI(self[:proxy_server]).scheme.to_sym else self[:proxy_type] = :none end end end diff --git a/lib/puppet_x/bodeco/util.rb b/lib/puppet_x/bodeco/util.rb index 1a94b5b..d0d26f6 100644 --- a/lib/puppet_x/bodeco/util.rb +++ b/lib/puppet_x/bodeco/util.rb @@ -1,189 +1,180 @@ module PuppetX module Bodeco module Util def self.download(url, filepath, options = {}) uri = URI(url) @connection = PuppetX::Bodeco.const_get(uri.scheme.upcase).new("#{uri.scheme}://#{uri.host}:#{uri.port}", options) @connection.download(uri, filepath) end def self.content(url, options = {}) uri = URI(url) @connection = PuppetX::Bodeco.const_get(uri.scheme.upcase).new("#{uri.scheme}://#{uri.host}:#{uri.port}", options) @connection.content(uri) end # # This allows you to use a puppet URL for a file and return its content. # # @example # puppet_download 'puppet:///modules/my_module_name/my_file.dat' # # @param [String] url this is the puppet url of the file to be fetched # @param [String] filepath this is the path of the file to create # # @raise [ArgumentError] when the file doesn't exist # def self.puppet_download(url, filepath) # Somehow there is no consistent way to determine what terminus to use. So we switch to a # trial and error method. First we start with the default and, if it doesn't work, we try the # other ones. status = load_file_with_any_terminus(url) raise ArgumentError, "Previous error(s) resulted in Puppet being unable to retrieve information from environment #{Puppet['environment']} source(s) #{url}.\nMost probable cause is file not found."
unless status File.open(filepath, 'w') { |file| file.write(status.content) } end # @private # rubocop:disable HandleExceptions def self.load_file_with_any_terminus(url) termini_to_try = [:file_server, :rest] termini_to_try.each do |terminus| with_terminus(terminus) do begin content = Puppet::FileServing::Content.indirection.find(url) rescue SocketError, Timeout::Error, Errno::ECONNREFUSED, Errno::EHOSTDOWN, Errno::EHOSTUNREACH, Errno::ETIMEDOUT, Puppet::HTTP::RouteError # rescue any network error end return content if content end end nil end # rubocop:enable HandleExceptions def self.with_terminus(terminus) old_terminus = Puppet[:default_file_terminus] Puppet[:default_file_terminus] = terminus value = yield Puppet[:default_file_terminus] = old_terminus value end end class HTTP require 'net/http' FOLLOW_LIMIT = 5 URI_UNSAFE = %r{[^\-_.!~*'()a-zA-Z\d;\/?:@&=+$,\[\]%]} def initialize(_url, options) @username = options[:username] @password = options[:password] @cookie = options[:cookie] @insecure = options[:insecure] if options[:proxy_server] uri = URI(options[:proxy_server]) unless uri.scheme uri = URI("#{options[:proxy_type]}://#{options[:proxy_server]}") end @proxy_addr = uri.hostname @proxy_port = uri.port end - @cacert_file = if options.key?[:cacert_file] - options[:cacert_file] - elsif ENV.key?('SSL_CERT_FILE') - ENV['SSL_CERT_FILE'] - elsif Facter.value(:osfamily) == 'windows' - File.join(__dir__, 'cacert.pem') - end + ENV['SSL_CERT_FILE'] = File.expand_path(File.join(__FILE__, '..', 'cacert.pem')) if Facter.value(:osfamily) == 'windows' && !ENV.key?('SSL_CERT_FILE') end def generate_request(uri) header = @cookie && { 'Cookie' => @cookie } request = Net::HTTP::Get.new(uri.request_uri, header) request.basic_auth(@username, @password) if @username && @password request end def follow_redirect(uri, option = { limit: FOLLOW_LIMIT }, &block) http_opts = if uri.scheme == 'https' { use_ssl: true, verify_mode: (@insecure ? 
OpenSSL::SSL::VERIFY_NONE : OpenSSL::SSL::VERIFY_PEER) } else { use_ssl: false } end - - http_opts[:ca_file] = @cacert_file unless @cacert_file.nil? - Net::HTTP.start(uri.host, uri.port, @proxy_addr, @proxy_port, http_opts) do |http| http.request(generate_request(uri)) do |response| case response when Net::HTTPSuccess yield response when Net::HTTPRedirection limit = option[:limit] - 1 raise Puppet::Error, "Redirect limit exceeded, last url: #{uri}" if limit < 0 location = safe_escape(response['location']) new_uri = URI(location) new_uri = URI(uri.to_s + location) if new_uri.relative? follow_redirect(new_uri, limit: limit, &block) else raise Puppet::Error, "HTTP Error Code #{response.code}\nURL: #{uri}\nContent:\n#{response.body}" end end end end def download(uri, file_path, option = { limit: FOLLOW_LIMIT }) follow_redirect(uri, option) do |response| File.open file_path, 'wb' do |io| response.read_body do |chunk| io.write chunk end end end end def content(uri, option = { limit: FOLLOW_LIMIT }) follow_redirect(uri, option) do |response| return response.body end end def safe_escape(uri) uri.to_s.gsub(URI_UNSAFE) do |match| '%' + match.unpack('H2' * match.bytesize).join('%').upcase end end end class HTTPS < HTTP end class FTP require 'net/ftp' def initialize(url, options) uri = URI(url) username = options[:username] password = options[:password] proxy_server = options[:proxy_server] proxy_type = options[:proxy_type] ENV["#{proxy_type}_proxy"] = proxy_server @ftp = Net::FTP.new @ftp.connect(uri.host, uri.port) if username @ftp.login(username, password) else @ftp.login end end def download(uri, file_path) @ftp.getbinaryfile(uri.path, file_path) end end class FILE def initialize(_url, _options) end def download(uri, file_path) FileUtils.copy(uri.path, file_path) end end end end diff --git a/spec/unit/puppet/provider/archive/curl_spec.rb b/spec/unit/puppet/provider/archive/curl_spec.rb index d6d01d2..85704f2 100644 --- a/spec/unit/puppet/provider/archive/curl_spec.rb +++ 
b/spec/unit/puppet/provider/archive/curl_spec.rb @@ -1,204 +1,189 @@ require 'spec_helper' curl_provider = Puppet::Type.type(:archive).provider(:curl) RSpec.describe curl_provider do it_behaves_like 'an archive provider', curl_provider describe '#download' do let(:name) { '/tmp/example.zip' } let(:resource) { Puppet::Type::Archive.new(resource_properties) } let(:provider) { curl_provider.new(resource) } let(:tempfile) { Tempfile.new('mock') } let(:default_options) do [ 'http://home.lan/example.zip', '-o', String, '-fsSLg', '--max-redirs', 5 ] end before do allow(FileUtils).to receive(:mv) allow(provider).to receive(:curl) allow(Tempfile).to receive(:new).with('.puppet_archive_curl').and_return(tempfile) end context 'no extra properties specified' do let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip' } end it 'calls curl with input, output and --max-redirects=5' do provider.download(name) expect(provider).to have_received(:curl).with(default_options) end end context 'username and password specified' do let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip', username: 'foo', password: 'bar' } end it 'populates temp netrc file with credentials' do allow(provider).to receive(:delete_netrcfile) # Don't delete the file or we won't be able to examine its contents. 
provider.download(name) netrc_content = File.open(tempfile.path).read expect(netrc_content).to eq("machine home.lan\nlogin foo\npassword bar\n") end it 'calls curl with default options and path to netrc file' do netrc_filepath = tempfile.path provider.download(name) expect(provider).to have_received(:curl).with(default_options << '--netrc-file' << netrc_filepath) end it 'deletes netrc file' do netrc_filepath = tempfile.path provider.download(name) expect(File.exist?(netrc_filepath)).to eq(false) end context 'with password containing space' do let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip', username: 'foo', password: 'b ar' } end it 'calls curl with default options and username and password on command line' do provider.download(name) expect(provider).to have_received(:curl).with(default_options << '--user' << 'foo:b ar') end end end context 'allow_insecure true' do let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip', allow_insecure: true } end it 'calls curl with default options and --insecure' do provider.download(name) expect(provider).to have_received(:curl).with(default_options << '--insecure') end end context 'cookie specified' do let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip', cookie: 'foo=bar' } end it 'calls curl with default options and cookie' do provider.download(name) expect(provider).to have_received(:curl).with(default_options << '--cookie' << 'foo=bar') end end context 'using proxy' do let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip', proxy_server: 'https://home.lan:8080' } end it 'calls curl with proxy' do provider.download(name) expect(provider).to have_received(:curl).with(default_options << '--proxy' << 'https://home.lan:8080') end end describe '#checksum' do subject { provider.checksum } let(:url) { nil } let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip' } end before do resource[:checksum_url]
= url if url end context 'with a url' do let(:curl_params) do [ 'http://example.com/checksum', '-fsSLg', '--max-redirs', 5 ] end let(:url) { 'http://example.com/checksum' } context 'responds with hash' do let(:remote_hash) { 'a0c38e1aeb175201b0dacd65e2f37e187657050a' } it 'parses checksum value' do allow(provider).to receive(:curl).with(curl_params).and_return("a0c38e1aeb175201b0dacd65e2f37e187657050a README.md\n") expect(provider.checksum).to eq('a0c38e1aeb175201b0dacd65e2f37e187657050a') end end end end describe 'custom options' do let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip', download_options: ['--tlsv1'] } end it 'calls curl with custom tls options' do provider.download(name) expect(provider).to have_received(:curl).with(default_options << '--tlsv1') end end - - context 'using cacert_file' do - let(:resource_properties) do - { - name: name, - source: 'http://home.lan/example.zip', - cacert_file: '/custom-ca-bundle.pem' - } - end - - it 'calls curl with --cacert' do - provider.download(name) - expect(provider).to have_received(:curl).with(default_options << '--cacert' << '/custom-ca-bundle.pem') - end - end end end diff --git a/spec/unit/puppet/provider/archive/wget_spec.rb b/spec/unit/puppet/provider/archive/wget_spec.rb index b3362e4..5d50d6e 100644 --- a/spec/unit/puppet/provider/archive/wget_spec.rb +++ b/spec/unit/puppet/provider/archive/wget_spec.rb @@ -1,171 +1,156 @@ require 'spec_helper' wget_provider = Puppet::Type.type(:archive).provider(:wget) RSpec.describe wget_provider do it_behaves_like 'an archive provider', wget_provider describe '#download' do let(:name) { '/tmp/example.zip' } let(:resource) { Puppet::Type::Archive.new(resource_properties) } let(:provider) { wget_provider.new(resource) } let(:execution) { Puppet::Util::Execution } let(:default_options) do [ 'wget', 'http://home.lan/example.zip', '-O', '/tmp/example.zip', '--max-redirect=5' ] end before do allow(FileUtils).to receive(:mv) allow(execution).to 
receive(:execute) end context 'no extra properties specified' do let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip' } end it 'calls wget with input, output and --max-redirect=5' do provider.download(name) expect(execution).to have_received(:execute).with(default_options.join(' ')) end end context 'username specified' do let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip', username: 'foo' } end it 'calls wget with default options and username' do provider.download(name) expect(execution).to have_received(:execute).with([default_options, '--user=foo'].join(' ')) end end context 'password specified' do let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip', password: 'foo' } end it 'calls wget with default options and password' do provider.download(name) expect(execution).to have_received(:execute).with([default_options, '--password=foo'].join(' ')) end end context 'cookie specified' do let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip', cookie: 'foo' } end it 'calls wget with default options and header containing cookie' do provider.download(name) expect(execution).to have_received(:execute).with([default_options, '--header="Cookie: foo"'].join(' ')) end end context 'proxy specified' do let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip', proxy_server: 'https://home.lan:8080' } end it 'calls wget with default options and proxy settings' do provider.download(name) expect(execution).to have_received(:execute).with([default_options, '-e use_proxy=yes', '-e https_proxy=https://home.lan:8080'].join(' ')) end end context 'allow_insecure true' do let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip', allow_insecure: true } end it 'calls wget with default options and --no-check-certificate' do provider.download(name) expect(execution).to have_received(:execute).with([default_options,
'--no-check-certificate'].join(' ')) end end describe '#checksum' do subject { provider.checksum } let(:url) { nil } let(:resource_properties) do { name: name, source: 'http://home.lan/example.zip' } end before do resource[:checksum_url] = url if url end context 'with a url' do let(:wget_params) do [ 'wget', '-qO-', 'http://example.com/checksum', '--max-redirect=5' ] end let(:url) { 'http://example.com/checksum' } context 'responds with hash' do let(:remote_hash) { 'a0c38e1aeb175201b0dacd65e2f37e187657050a' } it 'parses checksum value' do allow(Puppet::Util::Execution).to receive(:execute).with(wget_params.join(' ')).and_return("a0c38e1aeb175201b0dacd65e2f37e187657050a README.md\n") expect(provider.checksum).to eq('a0c38e1aeb175201b0dacd65e2f37e187657050a') end end end end - - context 'with cacert_file' do - let(:resource_properties) do - { - name: name, - source: 'http://home.lan/example.zip', - cacert_file: '/custom-ca-bundle.pem' - } - end - - it 'calls wget with default options and --ca-certificate' do - provider.download(name) - expect(execution).to have_received(:execute).with([default_options, '--ca-certificate=/custom-ca-bundle.pem'].join(' ')) - end - end end end diff --git a/spec/unit/puppet/type/archive_spec.rb b/spec/unit/puppet/type/archive_spec.rb index c7ec528..0864c40 100644 --- a/spec/unit/puppet/type/archive_spec.rb +++ b/spec/unit/puppet/type/archive_spec.rb @@ -1,169 +1,162 @@ require 'spec_helper' require 'puppet' describe Puppet::Type.type(:archive) do let(:resource) do Puppet::Type.type(:archive).new( path: '/tmp/example.zip', source: 'http://home.lan/example.zip' ) end context 'resource defaults' do it { expect(resource[:path]).to eq '/tmp/example.zip' } it { expect(resource[:name]).to eq '/tmp/example.zip' } it { expect(resource[:filename]).to eq 'example.zip' } it { expect(resource[:extract]).to eq :false } it { expect(resource[:cleanup]).to eq :true } it { expect(resource[:checksum_type]).to eq :none } it { 
expect(resource[:digest_type]).to eq nil } it { expect(resource[:checksum_verify]).to eq :true } it { expect(resource[:extract_flags]).to eq :undef } it { expect(resource[:allow_insecure]).to eq false } it { expect(resource[:download_options]).to eq nil } it { expect(resource[:temp_dir]).to eq nil } - it { expect(resource[:cacert_file]).to eq nil } end it 'verify resource[:path] is absolute filepath' do expect do resource[:path] = 'relative/file' end.to raise_error(Puppet::Error, %r{archive path must be absolute: }) end it 'verify resource[:temp_dir] is absolute path' do expect do resource[:temp_dir] = 'relative/file' end.to raise_error(Puppet::Error, %r{Invalid temp_dir}) end - it 'verify resource[:cacert_file] is absolute path' do - expect do - resource[:cacert_file] = 'relative/file' - end.to raise_error(Puppet::Error, %r{cacert_file must be absolute}) - end - describe 'on posix', if: Puppet.features.posix? do it 'accepts valid resource[:source]' do expect do resource[:source] = 'http://home.lan/example.zip' resource[:source] = 'https://home.lan/example.zip' resource[:source] = 'ftp://home.lan/example.zip' resource[:source] = 's3://home.lan/example.zip' resource[:source] = 'gs://home.lan/example.zip' resource[:source] = '/tmp/example.zip' end.not_to raise_error end %w[ afp://home.lan/example.zip \tmp D:/example.zip ].each do |s| it 'rejects invalid resource[:source]' do expect do resource[:source] = s end.to raise_error(Puppet::Error, %r{invalid source url: }) end end end describe 'on windows', if: Puppet.features.microsoft_windows?
do it 'accepts valid windows resource[:source]' do expect do resource[:source] = 'D:/example.zip' end.not_to raise_error end %w[ /tmp/example.zip \Z: ].each do |s| it 'rejects invalid windows resource[:source]' do expect do resource[:source] = s end.to raise_error(Puppet::Error, %r{invalid source url: }) end end end %w[ 557e2ebb67b35d1fddff18090b6bc26b 557e2ebb67b35d1fddff18090b6bc26557e2ebb67b35d1fddff18090b6bc26bb ].each do |cs| it 'accepts valid resource[:checksum]' do expect do resource[:checksum] = cs end.not_to raise_error end end %w[ z57e2ebb67b35d1fddff18090b6bc26b 557e ].each do |cs| it 'rejects bad checksum' do expect do resource[:checksum] = cs end.to raise_error(Puppet::Error, %r{Invalid value}) end end it 'accepts valid resource[:checksum_type]' do expect do [:none, :md5, :sha1, :sha2, :sha256, :sha384, :sha512].each do |type| resource[:checksum_type] = type end end.not_to raise_error end it 'rejects invalid resource[:checksum_type]' do expect do resource[:checksum_type] = :crc32 end.to raise_error(Puppet::Error, %r{Invalid value}) end it 'verify resource[:allow_insecure] is valid' do expect do [:true, :false, :yes, :no].each do |type| resource[:allow_insecure] = type end end.not_to raise_error end it 'verify resource[:download_options] is valid' do expect do ['--tlsv1', ['--region', 'eu-central-1']].each do |type| resource[:download_options] = type end end.not_to raise_error end describe 'archive autorequire' do let(:file_resource) { Puppet::Type.type(:file).new(name: '/tmp') } let(:archive_resource) do described_class.new( path: '/tmp/example.zip', source: 'http://home.lan/example.zip' ) end let(:auto_req) do catalog = Puppet::Resource::Catalog.new catalog.add_resource file_resource catalog.add_resource archive_resource archive_resource.autorequire end it 'creates relationship' do expect(auto_req.size).to be 1 end it 'links to archive resource' do expect(auto_req[0].target).to eql archive_resource end it 'autorequires parent directory' do 
expect(auto_req[0].source).to eql file_resource end end end
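For quick reference, the percent-escaping performed by `safe_escape` in `PuppetX::Bodeco::HTTP` above can be exercised on its own. This is a minimal, self-contained sketch: the regexp and method body are lifted from the helper, while the example URL is made up for illustration:

```ruby
# Standalone sketch of PuppetX::Bodeco::HTTP#safe_escape: every byte
# falling outside the allowed URI character set is rewritten as an
# uppercased %XX escape (one escape per byte).
URI_UNSAFE = %r{[^\-_.!~*'()a-zA-Z\d;\/?:@&=+$,\[\]%]}

def safe_escape(uri)
  uri.to_s.gsub(URI_UNSAFE) do |match|
    '%' + match.unpack('H2' * match.bytesize).join('%').upcase
  end
end

puts safe_escape('http://home.lan/a file.zip') # => http://home.lan/a%20file.zip
```

This is why a redirect whose `Location` header is not pre-encoded (a space in the path, for example) can still be handed to `URI()` by `follow_redirect` without raising.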