Valentin Hamburger

Unlock the power of UCP Director - Create vRA cloud services - Part 3

Blog post created by Valentin Hamburger on Oct 14, 2015

Gentlemen, start your scripting

Ok, in part 2 we prepared what to do and even started with the workflow itself by putting in a check whether the desired datastore name might already be taken. So far so good. Now let's get on with the rest of the workflow and finally form the REST call to create and attach a volume to a cluster.

 

Get the storage ID

[Image: Get Storage System icon]

In part 2 we decided to automatically look up the storage system ID. This is done with another UCP Director REST call, storing the output in an attribute called StorageID.

The call is constructed like this:


GET https://ucpmanagement.ucp.local/api/storagesystems HTTP/1.1

Source: API reference guide page 870


Hint: If you put all this into the REST client in the browser of your choice, you can watch the call go through. This is a very easy way of testing your data / REST calls. But remember - there will be no "are you sure" question - REST is very powerful! Be sure to only use "non-harming" functions for this kind of "wild" testing.

 

By issuing this simple "GET" call you will receive all attached storage systems. Currently UCP Director supports only one storage system, so it will be easy to identify. Now we need a bit of JavaScript power to store the output in the variable StorageID.

 

Let's create a scripted task to do the actual REST call and store the information in an attribute. Just drag a "scripted task" element into your workflow chain. I always like to name it right after its function: Get Storage System.

[Image: Get Storage ID variables]

Now we need to define the "IN" and "OUT" variables for the scripted task. These variables are either predefined in the "IN" variables section of the workflow itself (such as restHost), or attributes such as "errorCode" or "StorageID". Now let's write the JavaScript code for issuing the REST call.
Make sure "StorageID" is set in the "OUT" tab of the scripted task – this is important for the next step!
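
As a rough reference, the bindings for this task look like this (the attribute types are my assumption – double-check them against your own workflow):

// IN:  restHost  (REST:RESTHost) – the UCP Director REST host we added in part 2
// OUT: StorageID – will receive the storage system ID
//      errorCode – used later for the exception handling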

 

First part:

//Prepare the REST URL and method
requestType = "GET";
requestContentType = "application/json";
var operationUrl = "/storagesystems";
requestContent = "";

//In parameters (not needed for a GET operation)
var param_1 = "";
var param_2 = "";
var inParametersValues = [param_1, param_2];

//Build the REST operation
var op = new RESTOperation("CAV");
op.method = requestType;
op.urlTemplate = operationUrl;
op.defaultContentType = requestContentType;
var operation = restHost.addOperation(op);
var request = operation.createRequest(inParametersValues, requestContent);


In this first part of the scripted task we create the parameters for the REST call. The call is constructed using the "RESTOperation" scripting object (from the vRO HTTP-REST plug-in). Note that you only need the part of the REST URL after "/api", so all we need to add is "/storagesystems". This is because we already added the REST host in part 2 of this guide. Also, the nicely pink-highlighted variables are the ones we added to the scripted task in the IN or OUT tab. This makes them easy to spot and lets you check whether you have specified them all correctly.

 

Now to the second part of the scripted task:

//Execute the request
var response = request.execute();

//Store the response in the answer variable as a string
answer = response.contentAsString;
//Create a JSON object from the response content
var crArray = JSON.parse(answer);
//Set the storage system ID
StorageID = crArray[0].Id;

//Set the error code in case something goes wrong
errorCode = response.statusCode;
//Log the REST URL for debugging
System.log("Request: " + request.fullUrl);

 

Ok – that's it. In this code block we issue the actual request to UCP Director by calling request.execute(). With that call we also receive the response and store it as a string in an attribute called answer.

To enable search functionality on this data we then transform it into a JSON object. This makes our life easier when we search for specific values in a REST response.

Since we want the storage system ID, we simply access it by addressing the field "Id" of the first element [0] in the parsed object.

For debugging reasons, we also log the full REST URL in the workflow's log tab (by using the "System.log" vRO command).

Bam – now we have our storage system ID.


Hint: If you wonder how we know that the first element holds our storage ID, that's simple. Run the REST GET method in your browser window and you will receive the "response" with all its content. There you can see what the field is called and at which position you will find it. It's that easy.
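
For illustration only, a heavily simplified (and partly made-up) version of such a response could look like the following – the real payload contains many more fields, but the part we care about is the "Id" of the first array element:

[
  {
    "Id": "123456",
    ...
  }
]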

 

OK – the good news is that the first part of the REST call is always the same; all you need to do is change the requestType, operationUrl and requestContent to issue another call. All the other lines can be re-used by copying them into a new scripted task element.

 

Get the Pool ID

[Image: Pool ID icon]

We also decided to get the storage pool ID automatically, to make the service usable for a broad audience. For simplicity, we only select the first pool ID that comes back. If you are in a production environment, you could also create a method to use a specific pool (like HDT) or the pool with the highest free space available (see the sketch further below). Be creative – Orchestrator is your friend.

To receive the pool ID we use, once again, the get storage systems REST call. But this time we specify the storage system by its ID to get the pool details.

 

The REST call looks like this:


GET https://ucpmanagement.ucp.local/api/storagesystems/123456/pools HTTP/1.1

Source: API reference guide page 888

 

The call is very similar to the former "Get Storage System" task, so we just copy the scripted task, rename it to "Get Pool ID" and only change what is needed to complete the call. Be sure to add the StorageID attribute to the "IN" tab and the Pool_ID attribute to the "OUT" tab, and to remove any unneeded attributes in both sections.

 

Change the following lines:

var operationUrl = "/storagesystems/" + StorageID + "/pools";

//Set the pool ID
Pool_ID = crArray[0].Id;

 

Since we use the same JavaScript to issue the REST call, you only need to change these two lines.
Make sure that you also change the variable bindings in the IN and OUT sections – this is very important for the next step!
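
If you would rather pick the pool with the most free space instead of simply the first one, a minimal sketch could replace the last line of the task – note that the property name "FreeCapacityInBytes" is an assumption of mine, so check the field names in your own GET .../pools response first:

//Sketch only: pick the pool with the most free capacity (property name assumed)
var bestPool = crArray[0];
for (var i = 1; i < crArray.length; i++) {
    if (crArray[i].FreeCapacityInBytes > bestPool.FreeCapacityInBytes) {
        bestPool = crArray[i];
    }
}
Pool_ID = bestPool.Id;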

 

Bringing it all together – create and attach the volume

[Image: Create and Attach Storage icon]


So far so good. We are done with the prework and can now issue the REST call which will create the volume on the storage system, do the FC zoning, take care of the mapping to the hosts and format the volume as a new VMFS datastore.

All with a single REST command to UCP Director!

 

Back to the create and attach REST API call, which we are now ready to execute. We use our well-known scripted task once again, but this time we also need to add a request body. Bind SizeInGB, Cluster, StorageID and Pool_ID as IN parameters to the scripted task (the DSName attribute from the earlier datastore-name check is needed here as well, since it provides the volume name). There is no OUT parameter to specify.

 

Remember the REST call should look like this (as described in part 2 of this series):

 

POST https://ucpmanagement.your.domain/api/clusters/domain-c1234/createandattachvolume HTTP/1.1
Content-Type: application/json; charset=utf-8

Request Body:
{
  "PoolId": "1",
  "VolumeSizeInBytes": 214748364800,
  "ShouldFormat": true,
  "StorageSystemId": "123456",
  "StorageSystemPortIds": null,
  "VolumeName": "OurNewDS"
}
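
As a quick sanity check on the numbers: 214748364800 bytes is exactly 200 * 1024 * 1024 * 1024, i.e. a 200 GB volume – the same conversion the SizeInB calculation below performs on the SizeInGB input.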

 

We copy the "Get Pool ID" scripted task, rename it to "Create and Attach Volume" and change/add the following lines in the script section.


requestType = "POST";
var operationUrl = "/clusters/" + Cluster.id + "/createandattachvolume";

//Recalculate the size in bytes
var SizeInB = SizeInGB*1024*1024*1024;

This time we also need to fill the requestContent variable with the specified values, as described in the "Request Body" section of the API guide. Since we are working with a string variable in JavaScript, we also need to take care to escape any special characters. For easier readability we split the string over multiple lines:

 

requestContent = "{" +
    "\"PoolId\": \"" + Pool_ID + "\", " +
    "\"VolumeSizeInBytes\": " + SizeInB.toString() + ", " +
    "\"ShouldFormat\": true, " +
    "\"StorageSystemId\": \"" + StorageID + "\", " +
    "\"StorageSystemPortIds\": null, " +
    "\"VolumeName\": \"" + DSName + "\"" +
    "}";

 

It might look weird at first, but it ensures that the content is formatted exactly as the REST API expects it, including the variable types (see the toString() method applied). The "\" character escapes special characters inside a text string, such as the quote sign.
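
As an alternative sketch (using the same IN parameters as above), you could also build the body as a JavaScript object and let JSON.stringify – available alongside the JSON.parse we already used – take care of all the quoting and escaping:

//Alternative: build the body as an object and serialize it
var body = {
    PoolId: Pool_ID,
    VolumeSizeInBytes: SizeInB,
    ShouldFormat: true,
    StorageSystemId: StorageID,
    StorageSystemPortIds: null,
    VolumeName: DSName
};
requestContent = JSON.stringify(body);

Both approaches produce the same request body; the manual string keeps the structure visible, while the object version is less error-prone when you add more fields.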

 

That is all you need to change in the final request; all the other code can stay the same as in the previous calls. Once you fire that off to UCP Director, it will go ahead and do its magic to create and attach a volume to the specified cluster.

 

Our finished workflow should look something like this:

[Image: Workflow overview]

The red icons are exceptions; every time something goes wrong, those are triggered. That's why we created the errorCode variable. Bind it to the "Exception" field in each scripted task and then simply connect the scripted task to the red icon. Your workflow will also work without that, but if something does go wrong, you will not know what it was :/
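
If you also want the exception path to fire on failed REST calls (and not only on script errors), a small optional addition at the end of each scripted task could look like this – purely a sketch of my own, not part of the original workflow:

//Optional: treat any non-2xx HTTP status as a failure so the exception path is triggered
if (response.statusCode < 200 || response.statusCode >= 300) {
    throw "UCP Director call failed with HTTP status " + response.statusCode + ": " + answer;
}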

 

That’s it!

-> start the workflow

-> select the UCP director REST host

-> select the cluster to add to

-> specify the size in GB


A few minutes afterwards you have your new datastore available – just like that.

No Zoning

No mapping of storage ports

No "add datastore" manual task

 

Just a simple workflow with three little inputs. That's the true power of UCP Director in a cloud environment!

 

Even better: In vRA go to "Extended Services" and just add the workflow to your services. This makes it super easy for any approved user or group to run:

[Image: vRA service catalog screenshot]

 

If you want to see all this and more in action - visit our booth P302 at VMworld Barcelona!!

Have fun creating a powerful software defined datacenter with UCP Director!
