Monday 29 April 2013

Chef Experiments - Create Users



The objective here is to create a users cookbook backed by data bags

Create the data bag

knife data bag create user_config

Create the user JSON file

data_bags/users/usr_sri.json 

{
    "id": "sri",
    "comment": "Sriram Rajan",
    "uid": 2000,
    "gid": 0,
    "home": "/home/sri",
    "shell": "/bin/bash",
    "pubkey": "<replace with the SSH public key>"
}


Import the file

knife data bag from file user_config data_bags/users/usr_sri.json


Create a key for the encrypted data bag

openssl rand -base64 512 > data_bags/users/enckey
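Since this key decrypts every item in the encrypted data bag, it is worth locking down its permissions and keeping it out of version control. A minimal sketch, assuming the repository layout used above:

```shell
# Generate the shared secret and restrict access to the current user.
# The data_bags/users/ path matches the layout used in this post.
mkdir -p data_bags/users
openssl rand -base64 512 > data_bags/users/enckey
chmod 600 data_bags/users/enckey
```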


Create the encrypted data bag

knife data bag create --secret-file data_bags/users/enckey password_config pwdlist



Edit the data bag

knife data bag edit --secret-file data_bags/users/enckey password_config pwdlist


{
    "id": "pwdlist",
    "sri": "Replace with SHA password string"
}


At this point you should have a data bag with users and an encrypted data bag with passwords. Now we move on to the cookbook.

Create the cookbook

knife cookbook create user_config

The recipe looks like this. We add each user, ensure the .ssh directory is created, and populate it with the public keys. The password is pulled from the encrypted data bag.


decrypted = Chef::EncryptedDataBagItem.load("password_config", "pwdlist")
search(:user_config, "*:*").each do |user_data|
    user user_data['id'] do
        comment user_data['comment']
        uid user_data['uid']
        gid user_data['gid']
        home user_data['home']
        shell user_data['shell']
        manage_home true
        password decrypted[user_data['id']]
        action :create
    end
  
    ssh_dir = user_data['home'] + "/.ssh"
    directory ssh_dir do
        owner user_data['uid']
        group user_data['gid']
        mode "0700"
    end

    template "#{ssh_dir}/authorized_keys" do
        owner user_data['uid']
        group user_data['gid']
        mode "0600"
        variables(
             :ssh_keys => user_data['pubkey']
             )
        source "authorized_keys.erb"
    end
end

The template file

user_config/templates/default/authorized_keys.erb 

<% Array(@ssh_keys).each do |key| %>
<%= key %>
<% end %>
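To see what the template produces, here is a standalone sketch of the same ERB logic outside Chef, using a local variable in place of the @ssh_keys template variable and a placeholder key. The point is that Array() wraps a single key string into a one-element array, so 'pubkey' can hold either one key or a list of keys:

```ruby
require "erb"

# Placeholder key for the sketch; in the recipe this comes from the data bag.
ssh_keys = "ssh-rsa AAAAB3Nza... sri@workstation"

# Same loop as authorized_keys.erb, with a local variable instead of @ssh_keys.
template = ERB.new(<<~TPL)
  <% Array(ssh_keys).each do |key| %>
  <%= key %>
  <% end %>
TPL

puts template.result(binding)
```

The output contains one line per key whether ssh_keys is a single string or an array of strings.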



Finishing up
knife cookbook upload user_config

Ensure the secret key for the encrypted data bag is also copied to the node and stored at /etc/chef/encrypted_data_bag_secret. You can bootstrap this file into the node build. See http://docs.opscode.com/essentials_data_bags_encrypt.html
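One way to get the key onto new nodes is via knife bootstrap: setting the secret file path in knife.rb should make bootstrap copy it to /etc/chef/encrypted_data_bag_secret on the node. Treat the option name as an assumption and verify it against the docs linked above:

```ruby
# knife.rb on the workstation (assumed setting; check the Chef docs)
encrypted_data_bag_secret "data_bags/users/enckey"
```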

Then add the recipe to a role or node run list and run the chef-client to test.

Designing in the cloud


Service based model
This is not a new concept (http://en.wikipedia.org/wiki/Service-oriented_architecture), but the cloud model makes it very important. Build your business model so that it can be consumed as a service. This also forces you to modularize components, which in turn gives you a high degree of portability.


Build for failure
The cloud is multi-tenant in most cases, and with that come challenges like noisy neighbours and failures of individual components. Build for these scenarios. Netflix's Simian Army (http://techblog.netflix.com/2011/07/netflix-simian-army.html) talks a lot about this and is an interesting read. Most importantly, plan for "what happens when".

In building for failure you are also creating a good recovery model. One of the benefits of running everything as code is that you can recover faster, which translates into better uptime.

The cloud is all about APIs and pluggability. Think about building a top-level API for your business model, then use vendor APIs and plug them into yours. Wherever possible, loosely couple your application interactions. For example, instead of direct database calls, use an API.
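As an illustrative sketch (all class names here are hypothetical), the idea is to put your own interface in front of vendor SDKs so that swapping providers only means writing a new adapter:

```ruby
# Hypothetical top-level storage API; vendor SDKs hide behind adapters.
class StorageService
  def initialize(adapter)
    @adapter = adapter
  end

  def save(key, data)
    @adapter.put(key, data)
  end

  def fetch(key)
    @adapter.get(key)
  end
end

# An in-memory adapter standing in for a real vendor SDK (S3, Cloud Files, ...).
class MemoryAdapter
  def initialize
    @store = {}
  end

  def put(key, data)
    @store[key] = data
  end

  def get(key)
    @store[key]
  end
end

storage = StorageService.new(MemoryAdapter.new)
storage.save("greeting", "hello")
puts storage.fetch("greeting")  # → hello
```

The application only ever talks to StorageService; moving to another provider means implementing put/get in a new adapter, not rewriting callers.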


Monitoring
Monitoring becomes more than just making sure your applications are working fine. If you leverage multiple cloud providers, you can use monitoring to make operational decisions: to choose the best cloud provider and save costs, or to find low-performing instances within the same provider. One important point here is to make sure your monitoring is vendor-agnostic and, wherever possible, not a tool provided by the vendor. Frameworks like Sensu (http://www.sonian.com/cloud-monitoring-sensu/) or tools like Riemann (http://riemann.io) can help.


Automation
The cloud will force automation to a large extent, and you need to embrace it. Automation also allows you to build across different vendors. When using multiple vendors, use a model that works on all platforms; open source libraries like libcloud provide vendor-agnostic ways to do this. Be careful with automation providers, as you can get vendor lock-in in a different way. While building your own autoscale model is complex, in the long term there is a lot more to gain, as it will fit your business model.


Think about data
The cloud provides commodity services for things like compute and storage, but your data is not a commodity. Think about distributing it over different vendors, or build that into your recovery model.


Think about security
Security in the cloud is a hot topic, and it is safe to say that it is still evolving. It is also something that is easily overlooked while you plug in the other nuts and bolts. Make sure things like identity management and access control models are at the heart of your cloud strategy. Even if security is not an immediate requirement, you can build these as services to be implemented at a later stage.