Building Vagrant Machines with Packer

Learn how to use Vagrant and Packer to build virtual machines for development

Sharing a common development environment with everyone on your team is important. It is hard, though, to keep the same dependencies, database versions, and other systems in sync across different machines.

Vagrant is a great tool that helps with this by managing the lifecycle of a virtual machine. As nice as Vagrant is, provisioning machines with it has always been a pain. A couple of months ago Mitchell Hashimoto, the creator of Vagrant, launched Packer.

Packer lets you build virtual machine images for different providers from one JSON file. You can use the same file and commands to build an image on AWS, DigitalOcean, or for VirtualBox and Vagrant. This makes it possible to develop on exactly the same system that you later run in production.
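For instance, Packer's -only flag lets you drive different builders from the same template. As a sketch (only the virtualbox builder is defined in the example that follows; the amazon-ebs and digitalocean names are hypothetical and would need matching builder entries in the template):

# Build only the VirtualBox image from the shared template
packer build -only=virtualbox packer.json

# With additional builders defined in the same file, the same command
# could target other providers, e.g.:
# packer build -only=amazon-ebs packer.json
# packer build -only=digitalocean packer.json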

In this blog post we will show you how to use Packer to build your Vagrant machines. In a follow-up post we will focus on how we use Packer to build all of our Continuous Deployment infrastructure.

Prerequisites for building Vagrant Machines
You need VirtualBox and Packer installed. VirtualBox provides packages for different operating systems. Packer is even easier: just download the right zip archive for your system and unzip it into your PATH.
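As a rough sketch, on a 64-bit Linux machine the Packer setup could look like this (the archive name and install directory are placeholders for whatever you downloaded):

# Unzip the downloaded Packer release into a directory and put it on your PATH
mkdir -p ~/packer
unzip packer_linux_amd64.zip -d ~/packer
echo 'export PATH="$HOME/packer:$PATH"' >> ~/.bash_profile
export PATH="$HOME/packer:$PATH"

# Verify that the packer binary is found
packer version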

Building your Virtual Machine with Packer
We've collected all the files necessary to build a Vagrant Machine with Packer in our Packer Example repository.

Packer uses builders, provisioners and post-processors as the main configuration attributes. A builder can, for example, be VirtualBox or AWS. A provisioner can be used to run different scripts. Post-processors run after the machine image is built; converting a VirtualBox image into a box suitable for Vagrant, for example, is done in a post-processor.

Here is the main packer.json file. You can see the builder, provisioner and post-processor defined:

{
  "provisioners": [
    {
      "type": "shell",
      "scripts": [
        "scripts/root_setup.sh"
      ],
      "override": {
        "virtualbox": {
          "execute_command": "echo 'vagrant' | sudo -S sh '{{ .Path }}'"
        }
      }
    },
    {
      "type": "shell",
      "scripts": [
        "scripts/setup.sh"
      ]
    }
  ],
  "builders": [
    {
      "type": "virtualbox",
      "boot_command": [
        "<esc><esc><enter><wait>",
        "/install/vmlinuz noapic preseed/url=http://{{ .HTTPIP }}:{{ .HTTPPort }}/preseed.cfg <wait>",
        "debian-installer=en_US auto locale=en_US kbd-chooser/method=us <wait>",
        "hostname={{ .Name }} <wait>",
        "fb=false debconf/frontend=noninteractive <wait>",
        "keyboard-configuration/modelcode=SKIP keyboard-configuration/layout=USA keyboard-configuration/variant=USA console-setup/ask_detect=false <wait>",
        "initrd=/install/initrd.gz -- <enter><wait>"
      ],
      "boot_wait": "4s",
      "guest_os_type": "Ubuntu_64",
      "http_directory": "http",
      "iso_checksum": "4d1a8b720cdd14b76ed9410c63a00d0e",
      "iso_checksum_type": "md5",
      "iso_url": "http://releases.ubuntu.com/13.10/ubuntu-13.10-server-amd64.iso",
      "ssh_username": "vagrant",
      "ssh_password": "vagrant",
      "ssh_port": 22,
      "ssh_wait_timeout": "10000s",
      "shutdown_command": "echo 'shutdown -P now' > shutdown.sh; echo 'vagrant'|sudo -S sh 'shutdown.sh'",
      "guest_additions_path": "VBoxGuestAdditions_{{.Version}}.iso",
      "headless": false,
      "virtualbox_version_file": ".vbox_version",
      "vboxmanage": [
        [
          "modifyvm",
          "{{.Name}}",
          "--memory",
          "2048"
        ],
        [
          "modifyvm",
          "{{.Name}}",
          "--cpus",
          "4"
        ]
      ]
    }
  ],
  "post-processors": ["vagrant"]
}

It builds a VirtualBox image and then exports it into a Vagrant box. The http folder contains a preseed.cfg file that is necessary for the automated Ubuntu installation.
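Before kicking off a full build you can sanity-check the template; packer validate reports syntax and configuration errors:

# Quick sanity check of the template before building
packer validate packer.json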

In the scripts folder you can find the root_setup.sh and setup.sh scripts.

The root_setup.sh script sets up necessary packages and parameters for Vagrant:

#!/bin/bash
set -e

# Updating and Upgrading dependencies
sudo apt-get update -y -qq > /dev/null
sudo apt-get upgrade -y -qq > /dev/null

# Install necessary libraries for guest additions and Vagrant NFS Share
sudo apt-get -y -q install linux-headers-$(uname -r) build-essential dkms nfs-common

# Install necessary dependencies
sudo apt-get -y -q install curl wget git tmux firefox xvfb vim

# Setup sudo to allow no-password sudo for "admin"
groupadd -r admin
usermod -a -G admin vagrant
cp /etc/sudoers /etc/sudoers.orig
sed -i -e '/Defaults\s\+env_reset/a Defaults\texempt_group=admin' /etc/sudoers
sed -i -e 's/%admin ALL=(ALL) ALL/%admin ALL=NOPASSWD:ALL/g' /etc/sudoers

# Install Redis
sudo apt-get -y -q install libjemalloc1
wget -q http://d7jrzzvab3wte.cloudfront.net/checkbot/deb/redis-server_2.6.13-1_amd64.deb
sha1sum redis-server_2.6.13-1_amd64.deb | grep 'ab50cf037fd63e160946f8946b6d318cdf11800d'
dpkg -i redis-server_2.6.13-1_amd64.deb
rm redis-server_2.6.13-1_amd64.deb

# Install required libraries for RVM and Ruby
sudo apt-get -y -q install gawk libreadline6-dev zlib1g-dev libssl-dev libyaml-dev libsqlite3-dev sqlite3 autoconf libgdbm-dev libncurses5-dev automake libtool bison pkg-config libffi-dev libxml2-dev libxslt-dev libxml2

# Install Postgresql
sudo apt-get -y -q install postgresql libpq-dev postgresql-contrib

# Set Password to test for user postgres
sudo -u postgres psql -c "ALTER USER postgres WITH PASSWORD 'test';"

The setup.sh script installs further dependencies, such as Node.js (via nvm) and Ruby (via RVM), to set up the virtual machine exactly how you need it:

#!/bin/bash
set -e

echo "Instaling for rof"

# Installing vagrant keys
mkdir ~/.ssh
chmod 700 ~/.ssh
cd ~/.ssh
wget --no-check-certificate 'https://raw.github.com/mitchellh/vagrant/master/keys/vagrant.pub' -O authorized_keys
chmod 600 ~/.ssh/authorized_keys
chown -R vagrant ~/.ssh

# Node.js Setup
wget --retry-connrefused -q -O - https://raw.github.com/creationix/nvm/master/install.sh | sh
source ~/.nvm/nvm.sh

nvm install 0.10.18
nvm alias default 0.10.18

echo "source ~/.nvm/nvm.sh" >> ~/.bash_profile

# RVM Install
wget --retry-connrefused -q -O - https://get.rvm.io | bash -s stable
source /home/vagrant/.rvm/scripts/rvm

rvm autolibs read-fail
rvm install 2.0.0-p195

gem install bundler zeus

There is no limit to what you can run in your virtual machine through these scripts.

Building the Machine
We've added a create_box script that makes it easy for you to get started:

#!/bin/bash
set -e

#export PACKER_LOG=1

rm packer_virtualbox_virtualbox.box || true
packer build -only=virtualbox packer.json

vagrant box remove vagrant_machine || true
vagrant box add vagrant_machine packer/packer_virtualbox_virtualbox.box
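Assuming the script is saved as create_box in the root of the repository (the exact filename may differ), running it is simply:

# Make the helper script executable and run it from the repository root
chmod +x create_box
./create_box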

You will then see VirtualBox start up and build the machine.

[Screenshot: Packer building the machine in VirtualBox]

Run the script and it will create the Packer machine and import it into Vagrant. Then all you have to do is run:

vagrant destroy
vagrant up

And you have your development environment set up.

You can now get into the machine with vagrant ssh and start coding.
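If your project does not have a Vagrantfile yet, a minimal sketch for wiring it up to the imported box looks like this (vagrant init generates a Vagrantfile that points at the named box):

# Generate a Vagrantfile that uses the box imported by the create_box script
vagrant init vagrant_machine

# Boot the machine and SSH into it
vagrant up
vagrant ssh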

Conclusion
Vagrant is an incredibly powerful tool and together with Packer it is easy to build development environments for your whole team.

But this is only the beginning. Packer can go much further than just providing your development environment. We are currently adopting Packer as the tool to build all of our test infrastructure servers. This new set of tools is great for Immutable Infrastructure and Continuous Deployment, so you can build infrastructure that is more stable, secure, and easy to change than ever before.

Let us know in the comments how you use Packer and Vagrant. We are excited to hear your thoughts!

Additional Links


Download Efficiency in Development Workflows: A free eBook for Software Developers. This book will save you a lot of time and make you and your development team happy.

Go ahead and try Codeship for Continuous Integration and Continuous Deployment! Set up for your GitHub and BitBucket projects only takes 3 minutes. It's free!

More Stories By Manuel Weiss

I am the co-founder of Codeship – a hosted Continuous Integration and Deployment platform for web applications. On the Codeship blog we love to write about Software Testing, Continuous Integration and Deployment. Also check out our weekly screencast series 'Testing Tuesday'!
