
AngularJS expressions are like JavaScript expressions with the following differences:

  • Context: JavaScript expressions are evaluated against the global window. In AngularJS, expressions are evaluated against a scope object.

  • Forgiving: In JavaScript, trying to evaluate undefined properties generates ReferenceError or TypeError. In AngularJS, expression evaluation is forgiving to undefined and null.

  • Filters: You can use filters within expressions to format data before displaying it.

  • No Control Flow Statements: You cannot use the following in an AngularJS expression: conditionals, loops, or exceptions.

  • No Function Declarations: You cannot declare functions in an AngularJS expression, even inside the ng-init directive.

  • No RegExp Creation With Literal Notation: You cannot create regular expressions in an AngularJS expression. An exception to this rule is ng-pattern which accepts valid RegExp.

  • No Object Creation With New Operator: You cannot use new operator in an AngularJS expression.

  • No Bitwise, Comma, And Void Operators: You cannot use the bitwise, comma, or void operators in an AngularJS expression.

If you want to run more complex JavaScript code, you should make it a controller method and call the method from your view. If you want to eval() an AngularJS expression yourself, use the $eval() method.


AngularJS does not use JavaScript's eval() to evaluate expressions. Instead, AngularJS's $parse service processes these expressions.

AngularJS expressions do not have direct access to global variables like window, document or location. This restriction is intentional. It prevents accidental access to the global state – a common source of subtle bugs.

Instead use services like $window and $location in functions on controllers, which are then called from expressions. Such services provide mockable access to globals.
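As a rough plain-JavaScript sketch of the idea (an illustration only, not the real $parse implementation), an expression path is resolved against the scope object alone, so globals such as window are simply not reachable:

```javascript
// Resolve an "a.b.c"-style path against a scope object only,
// never against the global object.
function evalOnScope(path, scope) {
  return path.split('.').reduce(
    (obj, key) => (obj == null ? undefined : obj[key]),
    scope
  );
}

const scope = { user: { name: 'Ada' } };
console.log(evalOnScope('user.name', scope));    // "Ada"
console.log(evalOnScope('window.alert', scope)); // undefined – globals invisible
```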

It is possible to access the context object using the identifier this and the locals object using the identifier $locals.

<div class="example2" ng-controller="ExampleController">
  Name: <input ng-model="name" type="text"/>
  <button ng-click="greet()">Greet</button>
  <button ng-click="window.alert('Should not see me')">Won't greet</button>
</div>


Expression evaluation is forgiving to undefined and null. In JavaScript, evaluating a.b.c throws an exception if a is not an object. While this makes sense for a general-purpose language, AngularJS expressions are primarily used for data binding, which often looks like this:

{{a.b.c}}
It makes more sense to show nothing than to throw an exception if a is undefined (perhaps we are waiting for the server response, and it will become defined soon). If expression evaluation weren't forgiving, we'd have to write bindings that clutter the code, for example: {{((a||{}).b||{}).c}}

Similarly, invoking a function a.b.c() on undefined or null simply returns undefined.
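The forgiving behaviour and the hand-written workaround above can be illustrated in plain JavaScript (this is an illustration only, not AngularJS internals):

```javascript
// "a" is not yet set on the scope – e.g. we are waiting for a server response.
const scope = {};

// Plain JavaScript throws when an intermediate property is undefined.
let strictResult;
try {
  strictResult = scope.a.b.c;
} catch (e) {
  strictResult = e.constructor.name; // "TypeError"
}

// The cluttered binding the text mentions, written out by hand:
// each missing level is replaced by an empty object.
const forgivingResult = ((scope.a || {}).b || {}).c; // undefined, no exception

console.log(strictResult, forgivingResult);
```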

No Control Flow Statements

Apart from the ternary operator (a ? b : c), you cannot write a control flow statement in an expression. The reason behind this is core to the AngularJS philosophy that application logic should be in controllers, not the views. If you need a real conditional, loop, or to throw from a view expression, delegate to a JavaScript method instead.
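A plain-JavaScript sketch of that division of labour (the summary method name is hypothetical, standing in for any controller method):

```javascript
// A stand-in for a scope backed by a controller.
const scope = {
  items: [1, 2, 3],
  // Hypothetical controller method holding the "real" logic:
  // conditionals and loops live here, not in the template.
  summary: function () {
    if (this.items.length === 0) return 'empty';
    let total = 0;
    for (const n of this.items) total += n;
    return 'total: ' + total;
  }
};

// Fine as an expression: {{ items.length === 1 ? 'item' : 'items' }}
const label = scope.items.length === 1 ? 'item' : 'items';

// The loop and conditional only ever run via the method: {{ summary() }}
console.log(label, scope.summary());
```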

No function declarations or RegExp creation with literal notation

You can't declare functions or create regular expressions from within AngularJS expressions. This is to avoid complex model transformation logic inside templates. Such logic is better placed in a controller or in a dedicated filter where it can be tested properly.


Directives like ngClick and ngFocus expose a $event object within the scope of that expression. The object is an instance of a jQuery Event Object when jQuery is present, or a similar jqLite object otherwise.

<div ng-controller="EventController">
  <button ng-click="clickMe($event)">Event</button>
  <p><code>$event</code>: <pre>{{$event | json}}</pre></p>
  <p><code>clickEvent</code>: <pre>{{clickEvent | json}}</pre></p>
</div>

One-time binding

An expression that starts with :: is considered a one-time expression. One-time expressions stop recalculating once they are stable, which happens after the first digest if the expression result is a non-undefined value.

<div ng-controller="EventController">
  <button ng-click="clickMe($event)">Click Me</button>
  <p id="one-time-binding-example">One time binding: {{::name}}</p>
  <p id="normal-binding-example">Normal binding: {{name}}</p>
</div>

Reasons for using one-time binding

The main purpose of a one-time binding expression is to provide a way to create a binding that gets deregistered and frees up resources once the binding has stabilized. Reducing the number of expressions being watched makes the digest loop faster and allows more information to be displayed at the same time.

Value stabilization algorithm

One-time binding expressions will retain the value of the expression at the end of the digest cycle as long as that value is not undefined. If the value of the expression is set within the digest loop and later, within the same digest loop, it is set to undefined, then the expression is not fulfilled and will remain watched.

  1. Given an expression that starts with ::, when a digest loop is entered and expression is dirty-checked, store the value as V
  2. If V is not undefined, mark the result of the expression as stable and schedule a task to deregister the watch for this expression when we exit the digest loop
  3. Process the digest loop as normal
  4. When digest loop is done and all the values have settled, process the queue of watch deregistration tasks. For each watch to be deregistered, check if it still evaluates to a value that is not undefined. If that's the case, deregister the watch. Otherwise, keep dirty-checking the watch in the future digest loops by following the same algorithm starting from step 1
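The four steps above can be sketched in plain JavaScript (a simplified model of the algorithm, not AngularJS internals):

```javascript
// Simplified one-time-binding stabilization: each watch wraps
// a getter for an expression that started with "::".
function digest(scope, watches) {
  const deregisterQueue = [];
  for (const w of watches) {
    const v = w.get(scope);                       // step 1: dirty-check, store V
    if (v !== undefined) deregisterQueue.push(w); // step 2: schedule deregistration
  }
  // step 3: the rest of the digest loop would run here
  for (const w of deregisterQueue) {              // step 4: after values settle,
    if (w.get(scope) !== undefined) {             // still non-undefined?
      watches.splice(watches.indexOf(w), 1);      // stable – deregister the watch
    }
  }
}

const scope = {};
const watches = [{ get: s => s.name }]; // one-time watch, e.g. {{::name}}

digest(scope, watches);
console.log(watches.length); // 1 – value still undefined, keep watching

scope.name = 'Ada';
digest(scope, watches);
console.log(watches.length); // 0 – value stabilized, watch deregistered
```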

Scenario: Direct Access from Silverlight to Dynamics CRM 2011 Online


The standard approach in our case is to host the application as a Windows Azure web site. The web site will host a Silverlight application, which will access Microsoft Dynamics CRM 2011 Online and retrieve data from it. There is only one weakness in this plan: CRM Online does not publish cross-domain policy files such as crossdomain.xml and clientaccesspolicy.xml, and there are no tools to manage this or to upload the files as resources. This means you cannot connect to CRM Online from Silverlight directly; instead, you would have to host the Silverlight application within CRM itself. But that scenario requires all your visitors to be registered as CRM users, which is not possible for an internet-facing application. Let's work around this problem.



Scenario: Access to CRM 2011 Online Organization Service from a Web Server


The approach of accessing CRM from a web server component requires some extra work. First, we need to provide a WCF RIA service for the Silverlight application. This service will wrap the calls to the CRM Organization service. Additionally, it can be used to increase the security of the application and restrict API access.



For this scenario, Windows Identity Foundation must be installed on the server. As expected, WIF is not installed in the cloud, so you need to add a reference to Microsoft.IdentityModel.dll (C:\Program Files\Reference Assemblies\Microsoft\Windows Identity Foundation\v3.5) in your project, with Copy Local = true. The code for interacting with CRM uses the proxy classes from the SDK and the entity classes generated by CrmSvcUtil.exe. The DeviceIdManager class is also available in the SDK samples (sdk\samplecode\cs\helpercode):


string userName = "<windows live>";
string password = "<live password>";

ClientCredentials credentials = new ClientCredentials();
credentials.UserName.UserName = userName;
credentials.UserName.Password = password;

Uri organizationUri = new Uri(@"");
Uri homeRealmUri = null;
Uri issuerUri = new Uri(@"");

string deviceName, devicePassword;

// Local development: device credentials can be persisted to a file.
DeviceIdManager.PersistToFile = true;
ClientCredentials cred = DeviceIdManager.LoadDeviceCredentials(issuerUri);
deviceName = cred.UserName.UserName;
devicePassword = cred.UserName.Password;

// Windows Azure: file persistence is not available, so disable it and
// supply the registered device name and password explicitly (see note 3 below).
DeviceIdManager.PersistToFile = false;
deviceName = "cvrmd6i7y6fozei5renofkmt";
devicePassword = "-r~-~pe`3ecWZ+ExW3Kb%F#Z";

ClientCredentials deviceCredentials =
    DeviceIdManager.LoadOrRegisterDevice(issuerUri, deviceName, devicePassword);
OrganizationServiceProxy proxy =
    new OrganizationServiceProxy(organizationUri, homeRealmUri, credentials, deviceCredentials);
// Mandatory: without this call, deserialization of the generated
// entity types fails (see note 2 below).
proxy.EnableProxyTypes();

// "app" holds the data submitted from the Silverlight application.
Xrm.XrmServiceContext context = new Xrm.XrmServiceContext(proxy);
techart_growerapplication gapp = new techart_growerapplication();
gapp.techart_firstname = app.FirstName;
gapp.techart_FamilyName = app.LastName;
gapp.techart_SecondName = app.SecondName;
gapp.EmailAddress = app.Email;




Important notes about this code:


1. The exact issuer URI in your case can be found in the WSDL for the Organization service, or you can use the WsdlTokenManager class demonstrated in the SDK (sdk\samplecode\cs\wsdlbasedproxies\online).


2. The call to EnableProxyTypes is mandatory. Without it, you will receive an exception:


The formatter threw an exception while trying to deserialize the message:
There was an error while trying to deserialize parameter
The InnerException message was 'Error in line 1 position 8997.
Element ''
contains data from a type that maps to the name 'Xrm:techart_application'.
The deserializer has no knowledge of any type that maps to this name.
Consider changing the implementation of the ResolveName method on your DataContractResolver
to return a non-null value for name 'techart_application' and namespace 'Xrm'.'.
Please see InnerException for more details.


3. Setting DeviceIdManager.PersistToFile = false and supplying the device name and password explicitly is mandatory to make this work on Windows Azure. The device ID is registered in Windows Live. Windows Azure does not support storing user- or machine-level files, which is why we must not persist the device ID to a file. As you can see, this is only required for the release environment on Windows Azure.


So, this scenario will allow you to implement the required behaviour.


Scenario: Using Windows Azure Service Bus and ACS to Interact with Dynamics CRM 2011 Online


This scenario allows you to use all the benefits of the Microsoft cloud platform. Dynamics CRM 2011 Online has built-in support for integration via the Azure Service Bus. Typically, you download a certificate from CRM and use it to establish a trusted relationship with another application through the Service Bus.



So, in general, this is it. The main issues will arise, as always, during the implementation of the solution. But Azure currently provides spectacular tools that allow you to deliver a solution as quickly as possible without worrying about maintaining and supporting the hosting environment. The Azure Service Bus could be expensive for a small company, but it is a good tool for mid-size organizations. I must admit that the current implementation of the Service Bus is far from an enterprise-level product, and you should consider other available products such as MS BizTalk on-premises, Oracle Service Bus, or Tibco EAI. But I expect that within two years it will become a real pearl for integration projects.

In order to get our new customers acclimated as quickly as possible, we offer a FREE monthly Jumpstart session on the first Thursday of every month. This free, one-hour webinar will ensure that you have everything you need to get started with the Turbonomic platform.  Expert instructors will introduce the technology and show you how Turbonomic will ensure application QoS while maximizing the efficiency of your environment.


All new customers will benefit from this session. Visit the Education page to see the upcoming schedule or you can go directly to the registration page to sign up and pick a convenient date.


Topics covered include:

  • Using Turbonomic actions to assure quality of service while maximizing efficiency
  • Preventing problems in your virtual environment before they occur
  • Investigating how infrastructure changes would affect performance
  • Running what-if scenarios, such as workload fluctuations and hardware migrations, to identify the best workload placement and necessary resources
  • Reclaiming unused storage
  • Customizing Turbonomic dashboards and reports


Register now!

Hi Green Circle-ians!


Did you know that we have Free Trainings the first Monday of every month?  Register for future sessions Here:


If you missed the training session yesterday or just want to re-watch the recording please see below!



The session will introduce you to the breakthrough technology that drives VMTurbo Operations Manager, and demonstrate how it assures application performance while utilizing virtualized and cloud infrastructure resources as efficiently as possible.

Who will Benefit: enterprise and virtualization architects, capacity planners, system administrators, and IT and support personnel. There is no limit to who can participate from your organization.

Some of the key lessons they will take away include:

  • Preventing problems in your virtual environment before they occur
  • Using VMTurbo recommendations for workload placement to maximize efficiency
  • Exploring the wealth of information to gain visibility and insight into your virtual environment
  • Using VMTurbo to analyze historical trends
  • Customizing the VMTurbo dashboards
  • Planning for projected workload fluctuations and hardware migrations
  • Investigating how infrastructure changes would affect performance and efficiency


We have other training options as well, including on-site training and personalized 4-hour remote training:


Email us if you're interested in learning more - or - as always, email our Customer Experience Team and they can help you out as well!


Economics Links

Posted by david.fiore Expert Jul 9, 2014

At the core, our product harnesses market forces to solve a very real and very complex problem: how to coordinate disparate bits of knowledge into action with the goal of optimal resource allocation (resources going to their highest valued good).  There has been an ongoing debate in the field of economics and in our country as to how to answer this question.  Some would say that the solution is to have wise governors act as knowledge clearinghouses, whose job it would be to allocate resources in the most efficient way possible. This approach may be called the Central Planning approach, which in its most extreme form is socialism or communism.  Others say the knowledge is too vast and too widely dispersed for any group of people to harness it effectively for the larger society.  Instead, resource allocation happens best when free markets are allowed to flourish and autonomous individuals, acting in their own best interests, compete for resources.

Here is an entertaining video which outlines some of the big themes of both positions.

Our software, much like the world we live in, operates on the principles of the free market (The Market), with some regulation (policies, constraints) thrown in.  Just like in the real world, we would have better resource allocation if it weren't for the political constraints that hamper the free market; but we must take those political factors into account or we won't be able to act at all.

For those who would like to get an introduction to economics, I have a few resources to recommend, listed below.  I recommend taking them in order:  Try I, Pencil first.  If that whets your appetite for more, go on to Economics In One Lesson, and so on.

For those who prefer to learn by listening rather than by reading, may I suggest the EconTalk podcast.  My favorites are the episodes where the guest is Mike Munger, but there are lots of great episodes out there.  Here are a few to get you started:

This one is a little tongue-in-cheek:

Explore VMTurbo Operations Manager training offerings. Here:

This training video was updated on February 25, 2014 and introduces the VMTurbo Technology



To watch the video on Vimeo click here