
White Paper on Dynamics CRM 2015 Outlook Synchronization, Part 3


Hello, everyone.

Continuing from the previous posts, this article introduces the white paper on Outlook client synchronization.
Since this article is part of a series, please read the earlier posts first if you have not seen them.

White Paper on Dynamics CRM 2015 Outlook Synchronization, Part 1
White Paper on Dynamics CRM 2015 Outlook Synchronization, Part 2

Microsoft Dynamics CRM: How it works documentation:
http://www.microsoft.com/en-us/download/details.aspx?id=48718

This time we look at how the various components and processes work together to
synchronize data in specific Dynamics CRM 2013 / 2015 scenarios.

Synchronization in Dynamics CRM 2013 / 2015

Outlook client synchronization is driven by the conditions defined in filters.
Records are first replicated from CRM to Outlook; after that, changes made in either CRM
or Outlook are synchronized according to the system filter conditions.

Changing an item in the Outlook client

The following figure shows the synchronization flow when a synchronized contact record is
changed in the Outlook client.

Changes are stored in the Outlook client's local database and are applied to CRM at the
next synchronization.
The key point is that a change made before the Outlook client has finished loading its
cache from the local database is not detected in step 2 and is therefore not synchronized.
To deal with this, the Outlook client records each item's last-modified time and
synchronization status and uses them to build the MAPI content.
This way, operations performed after the last synchronization are still included in the next one.


Changing an item in Dynamics CRM

The following shows the synchronization flow when a synchronized record is changed in CRM.
The Outlook client calls the PrepareSync API on the server, which returns information about
the items changed since the last synchronization.
The changes are then stored in the Outlook client's local database and
applied to the Outlook client.


Merging changes

Outlook client synchronization is bidirectional: Outlook → CRM and CRM → Outlook.
When synchronization runs, the latest version of each item is recorded in an XML file.
When an item is changed in either Outlook or CRM, the modification times are used to
compare the columns of the latest item against the XML file, and the changes are synchronized.
If an item is changed in both Outlook and CRM, the most recent change wins and overwrites the earlier one.

The example below is a scenario in which three fields are changed. For a field changed in both
Outlook and CRM (such as fields A and B here), synchronization compares the item against the
XML file; because the CRM-side change is more recent than the Outlook-side change, the CRM
change is synchronized, and the Outlook-side change is lost.
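As a rough illustration only (this is not the actual synchronization engine, and the function and parameter names below are made up for the example), the field-level rule boils down to a last-writer-wins comparison of the modification timestamps:

# Illustrative sketch: return whichever value has the newer last-modified timestamp.
function Resolve-FieldConflict {
    param($OutlookValue, [datetime]$OutlookModifiedOn, $CrmValue, [datetime]$CrmModifiedOn)
    if ($CrmModifiedOn -ge $OutlookModifiedOn) { return $CrmValue }   # the CRM edit is newer, so it wins
    return $OutlookValue                                              # otherwise the Outlook edit wins
}

# Example: the CRM edit was made later, so the CRM value is kept and the Outlook edit is lost.
Resolve-FieldConflict -OutlookValue 'Contoso Ltd.' -OutlookModifiedOn '2015-11-20 09:00' `
                      -CrmValue 'Contoso Limited' -CrmModifiedOn '2015-11-20 10:30'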


Address book synchronization

When you create an e-mail message or appointment in Outlook, the Outlook client lets you set
CRM contacts as recipients. By default, only contacts can be set as recipients, but accounts
and leads can also be made available.

 

Summary of synchronization rules

The table below shows when operations performed in the Outlook client are synchronized.
"Yes" means the change is synchronized immediately after the operation.
"Explorer" refers to an Outlook view;
"Inspector" refers to the window opened by double-clicking an item in a view.


* To delete the record from CRM as well, you must explicitly choose Delete in the dialog
that appears after clicking the Untrack button.

Settings related to synchronization

Outlook client settings (in the Outlook client: File > CRM > Options)

Setting: Update the Company field of Outlook contacts
Description: Updates the Company field on Outlook contact records.

Setting: Set this computer to be the synchronizing client between Outlook and the user's primary Microsoft Dynamics CRM organization
Description: Designates which client performs synchronization.

Setting: Schedule automatic synchronization with Outlook
Description: Sets the interval at which CRM items are synchronized.

 

CRM system settings

Setting: Email processing for unapproved users and queues
Description: To allow email to be processed only for users and queues whose email addresses have been approved by a system administrator, select the following check boxes:
- Process emails only for approved users
- Process emails only for approved queues

Setting: Use folder-level tracking for Exchange folders
Description: Users can set up Exchange tracking folders and move messages into them so that the messages are tracked automatically on virtually any device.

Setting: Track emails sent between CRM users as two activities
Description: Two email activities are created for email between CRM users: one for the sender and one for the recipient.


Frequently asked questions about synchronization are also covered in the guide, so please take a look.

Summary

Over these three posts we have introduced Outlook synchronization.
The Outlook client is a tool with great benefits for users, because it lets them work with CRM
from Outlook, which they already use every day. On the other hand, the Outlook client's
synchronization directly affects performance, so it is important to configure it appropriately.
We hope this article is useful to you.

- Premier Field Engineering, 河野 高也


Dynamics CRM Online certification to Australian government security requirements


Six months ago we announced the important news confirming the Certification of Microsoft Office 365 and Microsoft Azure by the Australian Signals Directorate (ASD) via Australian Government’s Independent Registered Assessors Program (IRAP).

Australian government certification for Microsoft cloud services

This certification is for the handling of unclassified but sensitive data, known as Unclassified (DLM) within Australian government. This includes the majority of state and federal government data including private and personally identifiable information.

ASD established a Certified Cloud Services List and published a revised Information Security Manual (ISM) that provides a streamlined and effective way to verify the security of cloud services.

Increasingly, Australian Government agencies are turning to cloud services as a platform for innovation and efficiency. The Certified Cloud Services List streamlines the process by which those agencies can assure themselves of security and compliance by providing a Certification that can be trusted.

Australian government certification for Microsoft Dynamics CRM Online

Today, we have announced that Microsoft Dynamics CRM Online has also completed Certification to Australian Government security requirements for unclassified sensitive government data. This provides an assurance that any government agency can confidently adopt our cloud services, knowing that the security protections in place reach the high bar set by the Australian Signals Directorate.

Microsoft Dynamics CRM Online is the first customer relationship management solution able to provide this level of assurance in Australia, complementing our existing Certification of Office 365 and Microsoft Azure under the same program. Microsoft is the only cloud provider in Australia delivering a complete set of trusted cloud services covering Infrastructure-as-a-Service for computing, storage, database services and identity management; Platform-as-a-Service for modern applications; Software-as-a-Service for productivity and customer relationship management.

Demystifying data security

I think this certification is critically important to help you make appropriate decisions about using cloud services. We really aren't short of extensive information on our cloud services, and on the security and privacy of the data that your education institution stores there. We have heaps of information on the Office 365 Trust Center, the Microsoft Azure Trust Center and the Microsoft Dynamics CRM Trust Center. And there is no shortage of detailed information about our information security features (here's a typical slide I picked up from one of our PowerPoint presentations, about 2048-bit encryption).


But often the leaders in education organisations don't want to understand the details - what they want to know is the simple answer to the "Is it okay to use this?" question. And as an IT manager, that's where the federal certification helps to simplify the conversation, because you can answer "Yes" confidently with the certification information. Whichever of our cloud services your staff and students will be using - whether it's the various Microsoft Azure services, Office 365 Education or Dynamics CRM Online - you've got the ring of confidence that comes from the certification (from ASD, who have possibly the coolest strapline on their website).


NB: All of these services are now also delivered from data centres within Australia, and we’re in the middle of migrating all of our Australian customers from other datacentres to the Australian ones. New customers are automatically deployed in the Australian data centres.

Learn More: Read more about today's CRM Online certification announcement

Microsoft Dynamics CRM Online certified by the Australian Government for the handling of unclassified sensitive data


This morning we made an exciting announcement about Microsoft Dynamics CRM Online that can change the conversations you have with your Government agency prospects and customers.  Microsoft Dynamics CRM Online has completed Certification to Australian Government security requirements for unclassified sensitive government data via Australian Government’s Independent Registered Assessors Program (IRAP).  Microsoft Dynamics CRM Online is the first customer relationship management solution able to provide this level of assurance in Australia, joining Office 365 and Microsoft Azure Certifications under the same program.

This means that Government agencies can adopt Microsoft cloud services knowing that the security protections in place reach the high bar set by the Australian Signals Directorate.

To learn more, read the full announcement from James Kavanagh, CTO, Microsoft Australia.

Questions that inspire action and GROWTH


Having spent some time today with a number of Microsoft’s distributors to help their people support the growth of their Partners, I thought I would share a number of the topics we explored in the form of some questions to ask yourselves…

The context for this meeting centred on business growth (profitable growth at that) and creating lasting value with customers. So, let’s jump right in and in no particular order (which is intentional)… 

  • To what extent do you feel comfortable that the services you provide today will be relevant to customers in 6-12 months?
  • How well do you feel that you balance your time working ON Vs IN your business (planning Vs executing)?
  • Could everyone in the business articulate your value proposition?
  • Why do customers buy from you?  Does that give you a competitive advantage, and are you exploiting it as much as you could/should?
  • If you were to ask 10 business owners to share their top 3 frustrations today, do you have solutions that could help them?
  • How exposed would you feel if you lost your best sales guy / girl?
  • Of the new business you won this month, what percentage of this was generated from marketing activity and then, digital marketing activity?
  • How confident are you that prospective talent sees you as a destination employer?
  • What percentage of your existing team would recommend working for your organisation?
  • To what extent are the top team aligned in terms of the goal of the business?
  • How many new customers did you win this month? How is it trending over the last 6 months and is that sustainable?
  • How much profit did your organisation generate in the last 6 months outside of typical business hours?
  • What solutions / services will you be selling in 6 months’ time?
  • How many of your last 10 customer wins referred a new prospect to you?
  • How are your people feeling right now?

Perhaps reading through the above questions and thinking about your answers will inspire new thoughts, or perhaps one of them triggered an aha or eureka moment… I will wait to hear…

CAPI2 code that will try to translate a CSP handle into a CNG handle


This blog post is with respect to CAPI2 and CNG.

We might encounter situations where, in our CAPI2 code, we see that the CSP handle being used is actually a CNG handle. This can be tricky to understand, since we are using a CAPI2 provider.

The reason is that there are many places in the CAPI2 code that will try to translate a CSP handle into a CNG handle. This is simply the way the code is implemented, and the selection criterion is based either on the ALG_ID or on the nature of the provider: if the ALG_ID indicates CNG, or if a CNG provider is used, the CNG flag is set.

By default, for OID "1.2.840.113549.1.1.11", if we don't use a CNG provider and do not initialize the CRYPT_OID_INFO structure (which contains the ALG_ID attribute), the values are set as shown below:

0:000> dx -r1 (*((CRYPT32!_CRYPT_OID_INFO *)0x768bc02c))
(*((CRYPT32!_CRYPT_OID_INFO *)0x768bc02c))       [Type: _CRYPT_OID_INFO]
    [+0x000] cbSize            : 0x24
    [+0x004] pszOID            : 0x768bc2e0 : "1.2.840.113549.1.1.11" [Type: char *]
    [+0x008] pwszName          : 0x768bc2cc : "sha256RSA" [Type: wchar_t *]
    [+0x00c] dwGroupId         : 0x4
    [+0x010] dwValue           : 0xffffffff
    [+0x010] Algid             : 0xffffffff
    [+0x010] dwLength          : 0xffffffff
    [+0x014] ExtraInfo         [Type: _CRYPTOAPI_BLOB]
    [+0x01c] pwszCNGAlgid      : 0x768bb83c : "SHA256" [Type: wchar_t *]
    [+0x020] pwszCNGExtraAlgid : 0x768bc53c : "RSA" [Type: wchar_t *]

0:000> dx -r1 (*((CRYPT32!_CRYPT_ALGORITHM_IDENTIFIER *)0x15f688))
(*((CRYPT32!_CRYPT_ALGORITHM_IDENTIFIER *)0x15f688))       [Type: _CRYPT_ALGORITHM_IDENTIFIER]
    [+0x000] pszObjId          : 0x1123424 : "1.2.840.113549.1.1.11" [Type: char *]
    [+0x004] Parameters        [Type: _CRYPTOAPI_BLOB]

 

For example, CryptSignCertificate() checks the ALG_ID and, based on its value, sets the CNG flag. In the above case, Algid is 0xffffffff (CALG_OID_INFO_CNG_ONLY), so the CNG flag is set.

If we set the ALG_ID to, say, CALG_SHA_256, the CNG route is not taken. The issue here is not with the CAPI2 CSP; it is simply the way the CAPI2 code is implemented.

Visual Studio Code - NEW FEATURE: Yo Code - Streamlined Customizations for VS Code


This feature was added in October in Visual Studio Code v0.9.1.

We re-purposed the Yeoman generator we used for the VS Code Node.js sample, generator-code, and we now use it to make creating common customizations easier.

Tip: If you still want access to the sample, you can find out how on the Node.js page.

In this release of the generator, we have the ability to create two common customizations:

  1. Additional color themes
  2. Syntax highlighters/bracket matchers

In the future, we'll add other options for rich customization of VS Code.

If you have a TextMate color theme (.tmTheme) or a TextMate language specification (.tmLanguage), you can bring them into VS Code using the 'code' Yeoman generator.

Install and run the code Yeoman generator as follows:

  1. npm install -g yo
  2. npm install -g generator-code
  3. yo code


The Yeoman generator will walk you through creating your customization, prompting for the required information. Once the generator is finished, copy the generator's output folder to a new folder under the .vscode/extensions folder and restart VS Code to use the new features.
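For example, on Windows the copy can be done from PowerShell (a minimal sketch; the folder name my-theme is just a placeholder for whatever the generator produced):

# Copy the generator's output (assumed here to be .\my-theme) into the VS Code extensions folder.
Copy-Item -Path .\my-theme -Destination "$env:USERPROFILE\.vscode\extensions\my-theme" -Recurse
# Restart VS Code afterwards so it picks up the new customization.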

In the future you will be able to publish these customizations to an external gallery to share them with the community.

Tip: If you want to share your customization with others in the meantime, you can simply send them a copy of the output from the generator and ask them to add it under their .vscode/extensions folder.

   

Check out all the new features and update to VSC v0.9.2:

https://code.visualstudio.com/updates

  

Have a great day!

   - Ninja Ed


Power BI Weekly Service Update


This week we focused on making your experience creating and sharing dashboards easier than ever. Being able to position tiles wherever you want, duplicate entire dashboards, and improved error messages all make it quicker and easier to create dashboards the way you want. Once your dashboards are just how you want them, improvements to groups and full screen mode make it easier to share your creations with those who matter most.

Duplicate Dashboards
Freely Position Dashboard Tiles
Improved Navigation for Full Screen View
Better Experience when Inviting Peers from your Organization to Power BI Groups
Improved Error Messages for Tiles

Duplicate Dashboards

Sometimes, you may want to create dashboards that have similar tiles for different groups of viewers. For example, you may have a dashboard for a manager that has tiles for all his direct reports and a dashboard for each of their reports with only the tiles that pertain to them. Previously you would have to make each dashboard individually, but now you can use an existing dashboard as a starting point, and just duplicate it as many times as you need.

To do this, start with the dashboard you want to duplicate. In this case, we are starting with a dashboard showing key metrics for a business, including metrics for each report.

 

To make a dashboard for each report, you just need to click the ‘…’ on the dashboard’s top right and select Duplicate Dashboard.

 

Give the duplicate dashboard a name.

 

And finally, modify the new dashboard as you see fit. In our case, we removed the tiles that pertained to the other reports and rearranged a few tiles.

 

Freely Position Dashboard Tiles

Previously, tiles always gravitated towards the top left corner of the dashboard unless there was another tile in the way. Now you are no longer restricted to that behaviour when designing the layout of your dashboards. You have full control to put your tiles wherever you choose on the dashboard.

 

Improved Navigation for Full Screen View

To make presenting your dashboards and reports even easier, we have made some improvements to Full Screen Mode. Now, when you click through a dashboard to a report in full screen mode, you can easily rotate through the report pages.

You can move left and right within the report pages and back to the dashboard by using the controls on the bottom right that appear when you move your mouse.

 

Not using a mouse? You can also navigate with your keyboard! Use the left and right arrow keys to move back and forth, and the backspace key to go back. The space bar toggles fit-to-screen view, and Esc exits full screen mode.

Better Experience when Inviting Peers from your Organization to Power BI Groups

Power BI groups bring you and your colleagues together to collaborate, communicate and connect with your data. You can create a group either in Power BI or Office 365. Then, you can invite co-workers into this group workspace where you can collaborate on your organization’s shared dashboards, reports and datasets. You can read more about how to create Power BI Groups here.

Until now, if you wanted to invite peers from your organization to your Power BI group, they needed to be in Azure Active Directory (that is, already an Office 365 or Power BI user). Now, you can invite peers from your organization to your group even if they are not in Azure Active Directory.

The experience to invite users not in AAD to Power BI Groups is the same as it is for users already in AAD. You can edit an existing Group and add a colleague to the group.

 

If the colleague is not a Power BI user and their identity is not in AAD, Power BI sends them an email invite to join Power BI.

Once they sign up for Power BI to access the group, you will see them as a member of the group. 

Improved Error Messages for Tiles

On occasion, you may encounter errors on your tiles. These could be due to many different issues, such as a field name changing in a dataset. We want to help you get going again as soon as possible when these errors happen, so we have improved the messaging you get when they do.

 

You will be able to view details about the error, and if you need extra help, access a troubleshooting article in our documentation by clicking Get help.

We hope that you like these new features. Try them out and let us know what you think in the comments section below!

Last but not least, check our past weekly updates to stay up to date on the latest and greatest for the Power BI service.

Event download links: TechTalk: Windows 10 et Mobilité en Entreprise ITPro


About the event

11 November 2015 at the Hotel Royal in Geneva

Since its launch on 29 July 2015, Windows 10 has seen strong adoption, with more than 75 million installations to date. For businesses, Windows 10 brings a large number of improvements, notably in the areas of security, deployment, device management and application development.

Would you like to know more about the topic?

Download links


System Center Operations Manager PowerShell Grid Widget: create an effective dashboard targeted at a service owner/contact


Hi There

Steve here, posting a blog entry for my colleague Arif, the great OpsMgr PFE. He doesn't have access to our blog yet, but we'll rectify that soon, so I'm posting some great material on his behalf.

Enjoy.

Steve
-------------------------------------------------

Visualization in System Center Operations Manager (SCOM): people generally know SCOM as a centralized place to get alerts and real-time information about what is happening with their systems or services. One way we can really help people express and share an understanding of their data is by creating visualizations around it.

In this blog, I will show one way of creating an effective dashboard in SCOM that brings in data not from the SCOM databases, but from an external source.

With the release of UR2 for System Center 2012 R2 and UR6 for System Center 2012 SP1, Microsoft included updates to the widgets which allow you to create richer dashboard visualizations within Operations Manager. This dashboard infrastructure consists of dashboards, widgets and the data sources to pull the data from, either Operations Manager itself or external data sources. More information about the new list of widgets can be found here: http://social.technet.microsoft.com/wiki/contents/articles/24133.operations-manager-dashboard-widgets.aspx.

Two of the more interesting new widgets, in my opinion, are the PowerShell Grid Widget and the PowerShell Web Browser Widget, which you could call "extensibility widgets":

  • These two widgets are really powerful: they can be extended and customized to your heart's content, and they let you visualize exactly what you want to show on your dashboard.

  • The PowerShell Grid Widget displays the result of a PowerShell script in a grid: you enter your PowerShell script, and the data it returns is shown in grid format inside the Operations Manager console. Because the data is shown in a grid, you may sometimes have to scroll, which is not ideal for a dashboard. However, when PowerShell is the best or the only option for getting the data, the PowerShell Grid Widget is the most effective way to build the dashboard.

Lots of examples are available, and community members have developed fantastic dashboards using the PowerShell Grid and PowerShell Web Browser widgets; browse to http://aka.ms/scomdashboards.

Before I get to the dashboard demo, I would like to talk about the context, or scenario, the demo is based on.

In System Center we make heavy use of Business Services, because this is how we represent the IT deliverables to the business.

Think of it as something that IT delivers to the business in the way the business understands it.  Intranet, email, voice, warehouse automation would be examples of a service consumed by the business and delivered by IT. 

In System Center Service Manager and Operations Manager we leverage the concept of services.  In Service Manager we articulate the service in our Business Service listing in the CMDB.

In my example, I have System Center Service Manager (SCSM) 2012 R2 set up with a few business services defined. You can see several individual services, each listed with a different function, priority, etc.

When we open a service, such as Service Desk, you can see that it includes detailed priority, service owner, contact person and business information. It also provides the information IT analysts or the service owner need when they want to know the configuration of a business service.

If we click on the "Service Components" tab, we can see the list of components this business service has. For the purposes of this demo I have limited the components to computer objects; however, a business service can contain any type of object (e.g. physical storage, network devices, etc.), and in SCSM we can add those types of service components.

Now, back to SCOM. My objective for this demo is to create a dashboard for service owners that brings in the information defined in SCSM. For this, I will use the SCOM dashboard PowerShell Grid Widget.

Let’s start. In the SCOM Console:

1. Create a new Management Pack (MP) from the Administration tab. For my demo, I named it “Demo Dashboards”

2. Once you create the MP, a new folder will appear under your Monitoring tab. Add a new 4-Cell Grid Layout view to the folder.

New -> Dashboard View -> Microsoft -> Grid Layout

3. Enter a name for the dashboard; for my example, "SCSM Data".

4. Select 4-Cells and Create the dashboard.

Time to start populating the grid with data from SCSM.

In the first grid, I want to show all the business services defined in SCSM. Click Click to add widget in the first grid, select PowerShell Grid Widget, and click Next.

5. Name the widget. This name will be shown above the grid. I named mine "Business Services". Click Next.

6. Copy and paste the PowerShell script from the downloaded file (the Get_Business_Services section at the bottom).

7. The PowerShell Grid Widget will now show the business services from SCSM.
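The actual scripts are in the download referenced above. As a rough sketch of the general shape of a PowerShell Grid Widget script (assuming the community SMLets module is used to query SCSM; the server, class and property names below are illustrative, not taken from the attached file), it looks something like this:

# Illustrative sketch only - not the script from the attached download.
Import-Module SMLets
$scsmServer = "scsm01"   # hypothetical SCSM management server name

# Get all business service instances from SCSM.
$serviceClass = Get-SCSMClass -Name "Microsoft.SystemCenter.BusinessService$" -ComputerName $scsmServer
$services = Get-SCSMObject -Class $serviceClass -ComputerName $scsmServer

# Shape each service into a row the PowerShell Grid Widget can display.
foreach ($service in $services) {
    $dataObject = $ScriptContext.CreateFromObject($service, "Id=Id,DisplayName=DisplayName,Status=Status", $null)
    if ($dataObject -ne $null) {
        $ScriptContext.ReturnCollection.Add($dataObject)
    }
}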

In the second grid, I want to show all the components of the business service selected in the first grid. Follow the above procedure to add the PowerShell Grid Widget and copy the script from the "Business_Services_Components" section. I named the grid "Components". Once done, if we select any business service in the first grid, the second grid will show all the related components of the selected business service.
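Inside that second script, the selection made in the first grid is available to the widget through the $globalSelectedItems collection. Again as an illustrative sketch only (the property names are hypothetical), the pattern is roughly:

# Illustrative sketch: read the business service(s) selected in the first grid...
foreach ($selectedItem in $globalSelectedItems) {
    $selectedServiceName = $selectedItem["DisplayName"]
    # ...then query SCSM for that service's components and return them as grid rows.
    # (The real query is in the Business_Services_Components section of the download.)
}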

 

8. SCSM is an ITSM process management tool that records incidents, and since I have configured integration between SCOM and SCSM, there is an SCSM incident for every alert on a computer object in SCOM. In the third grid, I want to show all the active incidents from SCSM for the server object selected in grid 2. Follow the same procedure to add the PowerShell Grid Widget and copy the script from the "Active_Incidents" section. I named the grid "Active Incident". Once done, if we select any server object in the second grid, the third grid will show all the related active incidents from SCSM for the selected server.

 

9. Next, in the last grid, I want to show all changes from SCSM related to the server selected in grid 2. This will help the service owner identify whether any change has caused the incident related to the selected server. The service owner can also see upcoming changes for the selected server that will impact the business service the server is a component of. Follow the same procedure to add the PowerShell Grid Widget and copy the script from the "Changes" section. I named the grid "Related Changes".

If we select a different business service in the first grid and a different server component in the second grid, we will see the related incidents and changes in grids 3 and 4 respectively.

In this blog, we have seen how we can use the SCOM PowerShell Grid Widget to bring in information from an external data source and build an effective dashboard for a service owner or service contact. Shown below is an example of the service-oriented dashboard, which can be targeted at the service owner.

Happy OpsMgr'ing

Arif

PS: Here is the script referenced in the blog.

Meet X-Tag!


Cross-post from my blog http://code4word.com/ 
----------------------------------------------

Hello developers!

Just wanted to share with you a newly released "Awesome" JavaScript Library called X-Tag.


X-Tag is a Microsoft supported, open source, JavaScript library that wraps the W3C standard Web Components family of APIs to provide a compact, feature-rich interface for rapid component development. Now, while X-Tag offers feature hooks for all Web Component APIs (Custom Elements, Shadow DOM, Templates, and HTML Imports), it only requires Custom Element support to operate.
As a fallback mechanism, X-Tag also uses a set of polyfills shared with Google's Polymer framework, in case there is no native support for Custom Elements.

X-Tag has wide browser support:
  • Edge (all versions and devices) and Internet Explorer 9+
  • Firefox (all versions, devices, and platforms)
  • Chrome (all versions, devices, and platforms) and Android 4+ stock browser
  • Safari and Safari iOS 5+
  • Opera 11+ (all devices and platforms)

Just in case you missed out on Web Components and are wondering what on earth they are:

Web Components are encapsulated, reusable and composable widgets for the web platform; basically, they are a collection of standards being worked out at the W3C. In summary, Web Components allow you to bundle markup and styles into custom HTML elements. Their introduction is very significant for web development because it builds the power and extensibility necessary to create complex widgets and applications right into the core web feature set. So go ahead and picture the capabilities of JavaScript libraries like Angular and Backbone, but as a foundation of the web platform, standardised across all browsers.

Cheers!

Using OAuth2 with SOAP


I started at Microsoft when SOAP was all the rage, before there was such a thing as WCF. So it is with some nostalgia that I tried to combine one of the latest technologies, the Universal App Platform (UAP), with SOAP, using the OAuth2 protocol for authentication. One possible application of this approach would be for folks who are slowly migrating from WCF to REST and/or need to mix REST and SOAP services in their current applications. Although the sample uses a UWP client, the same code would apply to other .NET clients, e.g. a WPF app that needs to use OAuth2 to authenticate to a SOAP service.

The basics of the attached samples are as follows:

  1. It is using Azure AD to provide the authentication service and therefore an OAuth2 access token to a UAP client.
  2. The client is using the ADAL library to acquire the token (see https://github.com/Azure-Samples/active-directory-dotnet-windows-store for more details).
  3. The SOAP service is a web hosted WCF service.

There are two main choices for passing authentication data to a SOAP service: via a custom SOAP header or, if the service is using HTTP (as in my case), via an HTTP header, e.g. the Authorization header. The first approach is more generic in that it can support non-HTTP WCF services; the second is likely faster (no XML parsing). Since UWP only supports basicHttpBinding, unless the service also has to support other, non-UWP clients using non-HTTP bindings, the HTTP header approach is just as good. The attached sample uses the custom-header approach but includes commented code for the HTTP-header approach as well.

To call the SOAP service, after a regular service proxy has been added to the UWP client and an OAuth2 token has been obtained, the client does the following to create and include a custom SOAP header carrying the OAuth2 access token:
var svc = new ServiceReference1.Service1Client();
using (var scope = new OperationContextScope(svc.InnerChannel))
{
    var authHeader = MessageHeader.CreateHeader("Token", "http://tempuri.org", result.AccessToken);
    OperationContext.Current.OutgoingMessageHeaders.Add(authHeader);

    // Do this if you want to use the http header instead
    //var httpRequestProperty = new HttpRequestMessageProperty();
    //httpRequestProperty.Headers[System.Net.HttpRequestHeader.Authorization] = result.AccessToken;
    //OperationContext.Current.OutgoingMessageProperties[HttpRequestMessageProperty.Name] = httpRequestProperty;

    var res = await svc.GetDataAsync(inp);
    _result.Text = res.ToString();
}
The service, which unfortunately cannot use the OWIN stack to validate the JWT token, implements its own ServiceAuthorizationManager-derived class and a JWT-validator. The validator (called AADJWTValidator in the sample) relies on the Azure AD federation metadata endpoint to retrieve the signing key.
That simple!

AX 2012 - Trial Balance and Fiscal Year Close


When the 'Opening transactions' periodic process (General ledger > Periodic > Fiscal year close) finishes running as part of the fiscal year close procedure, the ledger accounts are updated with the closing and opening transactions. The updated ledger can be reviewed in the Trial balance (General ledger > Common > Trial balance) or in the Trial balance summary (General ledger > Reports > Transactions).

 

Internally, when the 'Opening transactions' periodic process finishes, AX automatically runs a batch job (System administration > Inquiries > Batch jobs) named 'Rebuild balances'. Once this batch job finishes, the trial balance is updated.

This batch job is assigned to the blank batch group (Empty batch group) (System administration > Setup). [See Image 1]

 

 

Image 1 - Batch group

 

Occasionally, the balance shown in the trial balance may not match the total of the detailed posted transactions. This can happen in any of the following scenarios:

 

  1. The trial balance is viewed before the update job has finished. In this case, reopen the Trial balance form or simply refresh it.
  2. The batch group might not have a server assigned to run the batch job. In this case, assign a server to that batch group.
  3. The batch job has a status other than 'Ended', indicating that it has not finished or that it finished with an error. In this case, analyze the error, if any, in the Event Viewer.

 

In addition to the solutions described in the points above, it is also possible to run 'Rebuild balances' or 'Update balances' for the dimension set being viewed in the trial balance (General ledger > Setup > Financial dimensions > Financial dimension sets).

 

As a result, in the cases above, the balance in the trial balance will match the detail of the posted transactions once the 'Opening transactions' process has finished.

 

 

Para R

Microsoft Dynamics AX – built for the cloud


Overnight we released further details about the newest release of our enterprise resource planning solution Microsoft Dynamics AX (previously known as “AX7”) including general availability in the first quarter of calendar year 2016.  A public preview will be available for both Microsoft Partners and customers in early December. Hear some feedback from other Microsoft Partners.

The official announcement and blog posts from Mike Ehrenberg and Christian Pedersen provide full details about this exciting new release, including:

  • New user experience: looks and works like Microsoft Office, with deep integration into Dynamics AX, Dynamics CRM and Office 365; near-real-time analytics powered by Azure Machine Learning; data visualisation with Power BI embedded within the application; and a fully browser-based HTML5 client.
  • Enhanced implementation & management: enhanced capabilities of the Azure-based Microsoft Lifecycle Services (LCS), transforming the application lifecycle management of a Dynamics AX project.
  • Power of the cloud: on the Microsoft Azure platform this release brings the benefits of the cloud - simple signup, immediate provisioning, built-in high availability and disaster recovery, and elastic capacity to add resources and pay for them only while they are being used.

For further details visit www.microsoft.com/en-au/dynamics/erp-ax-overview.aspx.

Stop Hoping for Quality and Just Test It!


As I continue to apply more engineering rigor to the release process in my team, I hear statements referring to engineers being hopeful and hoping things will go well.  Hoping is not the correct way to ship software.  I also hear a lot of statements like “we are confident this will work”.  Confidence, although great to have as an engineer, cannot be the sole indicator that you are ready to release high quality software.  What you need to ship high quality software is testing.  It’s having the data that shows you ran the appropriate tests and validated not only that your software works when it should, but that it behaves correctly when it shouldn't, when you take an erroneous path through it, and that it fails gracefully if necessary.  My team has incorporated an extra check to make sure we truly are ready to ship our software when we think we are.  We call these extra checkpoints Release Review meetings or Go/No-Go meetings.  Think twice before saying you are confident, because it may come across like you are trying to sell the fact that the software is ready to release.  But this is not the place for a sales pitch.  The people who need to give the positive votes in a Release Review meeting don’t just need a statement of confidence.  Along with it, they need to see the data that backs it up, that proves all the correct items were tested and that the software works as expected.  I see many confident and hopeful software engineers working late nights and weekends because their confidence and hope were short-lived and inappropriately placed.  Please don't be one of them.
 
When we were all learning how to program and how computers work, one of the first things you learn is that the computer, and specifically the software, only does what you tell it to do.  If you incorrectly tell it to do something, it will.  Software can’t figure out your intentions.  It doesn’t have a mind of its own and it doesn’t think “hey, I’m betting my programmer really wanted to do this and not that”.  (Although some day it would be great if it could!)  Hoping your software does something is a programmer's way of assuming the software understands his or her intentions.  The only way to know if your software is doing what you want it to do is to test it.  You can’t hope and you can’t push your confidence as a programmer at it.  You can only test it.  Walk through the customer scenarios.  Walk through the places where things could fail and make sure it fails gracefully or recovers without failing.  Figure out what you are missing, where your program can fail that you haven’t considered, and then test that and see what actually happens.
 
Where is your program going to fail?  You should always ask that question.  And if your software is large and complex, that can be a hard question to answer.  So consider asking yourself: where are you taking risks?  What I mean by this is, where are the areas in your software:

  • That have dependencies on code outside what your team owns
  • That have some code that is unstable and is known to produce a lot of defects
  • That have some code that is a bit unknown due to it being legacy software, written by people who no longer work on the team and didn’t comment it well
  • That have some code that is written in a complex way that makes it difficult to understand
  • That don't have enough testing coverage
  • That when released, there is no way to rollback or fix forward your changes if problems occur

Communicating risks is hugely important in understanding the state of your software.  Understanding the risks early leads to people taking action to mitigate them and that leads to better software overall.  Communicating risks within your code is not a sign of weakness.  It's a sign that you understand all aspects of your very complex software system and you have the confidence as an engineer to state where the gaps are.  Good software engineers know how to test their code for quality and how to communicate the risks and gaps in their software ecosystem.

If you have read this far, I'm going to assume this topic interests you, so let me ask you to do a little assignment.  When you are at work, look at your feature, user story, or overall ecosystem and come up with 3 risky areas where your software may fail.  Rate the areas as high, medium, or low.  Then determine the best way to mitigate those risks.  Do you need to refactor your code, remove unnecessary dependencies, or do more testing?  Now go talk to a coworker about the risky areas you uncovered, and post your thoughts in the comment section of this blog.  Is it difficult to find 3?  Is it difficult to determine what a risky area in your software is?  Did you find this exercise easy?  I'd love to hear from you.  Please add a comment letting me know if this assignment was helpful.

Free edX Power BI course - starts next week


There's a brand new free edX course starting next week - "Analysing and Visualising Data with Power BI", taught by Will Thompson. It is the perfect opportunity to learn how you can start to build an education analytics system for your school\TAFE\university. I am a massive fan of using Power BI in education, because of the potential it offers to unlock institutional data and put the power of analysis into the hands of users (rather than conventional BI systems that have concentrated on putting 'reports' into users' hands). I've found that an absolute beginner can produce powerful visualisations of their data in a few hours, and that an advanced user needs the same level of skills that an advanced Excel user has. Whilst there are plenty of Microsoft partners who have built, or can build, educational analytics solutions for customers using Power BI in education, there's also an opportunity to create your own systems - and that's a great way to use this course.

If you want to see what’s possible with Power BI, then watch the video below to see what I created with data from Queensland’s Open Education Data site in a couple of days (more details on how I made it here)

If you’ve got some spreadsheets of data, or data trapped across a set of different databases, then this course would be a good way to learn how to unlock it to create new ways of telling a data story.

About this course

In this course, you will learn how to connect, explore, and visualize data with Power BI.

Power BI is a cloud-based business analytics service that helps create live operational dashboards from on-premises and cloud data in one central location that you can access across a range of devices.

Power BI Desktop provides a free-form canvas for drag-and-drop data exploration as well as an extensive library of interactive visualizations, simple report creation, and fast publishing to the Power BI service.

This course is taught in short, lecture-based videos, complete with demos, quizzes, and hands-on labs. The Power BI product team will guide you through Power BI end-to-end, starting from how to connect to and import your data, author reports using Power BI Desktop, publish those reports to the Power BI service, create dashboards, and share to business users so that they can consume the dashboards through the web and their mobile devices.

What you'll learn
  • Connecting, importing, shaping and transforming data for business intelligence
  • Visualizing data, authoring reports, and scheduling automated refresh of your reports
  • Creating and sharing dashboards based on reports in Power BI desktop and Excel
  • Using natural language queries
  • Creating real time dashboards

Course details

The free edX course starts online on November 24, and runs for 4 weeks. The instructors estimate it will involve 2-4 hours per week, and at the end you'll get an Honor Certificate from edX that you can add to your resume or share on LinkedIn (or you can sign up for a Verified Certificate for $49).

Make a date: Find out more, and register for the free edX course


Azure Deep Dive Osaka - Evangelists Assemble! 12 Sessions in Total, Including Content from AzureConf


 

We are holding an Azure event in Osaka.

Registration website: http://aka.ms/mad2015

We are pleased to announce that on Tuesday, December 15, 2015, we will hold "Azure Deep Dive - a day to know Azure more broadly and more deeply".

In the cloud era, continuously acquiring new technologies is becoming extremely important. Azure Deep Dive is a special event for engineers using Microsoft Azure, designed to help you understand Azure more broadly, more deeply, and with the latest information, so that you can use Microsoft Azure even more effectively. Across the day there are two tracks with a total of 12 sessions, covering technical topics such as ideal Azure configuration and management, development, and data analytics.

Date and time: Tuesday, December 15, 2015, 10:00 - 18:30 (reception opens at 9:30; networking party planned for 17:30 - 18:30)
* After the sessions, a networking party will be held in Seminar Room 2 of the Kansai branch office. The speakers from each session will also attend, so you can talk with them directly about Azure. We hope you will join us.

Venue: Microsoft Japan Kansai branch office, 5F, Seminar Rooms 1 and 2
5-6-16 Fukushima, Fukushima-ku, Osaka 553-0003, ラグザタワー ノースオフィス (reception on 2F)
http://www.microsoft.com/ja-jp/mscorp/branch/kansai.aspx

Intended audience: developers, operations administrators and architects at ISVs or cloud partner companies

Schedule for the day: http://aka.ms/mad-agenda

 

Session title / Abstract / Speaker

Azure Latest Updates: Focusing on This Year's Changes (大田昌幸)
Microsoft Azure is a fast-evolving cloud platform to which more than 500 features are added in a single year. Because of that pace, it is hard to keep track of every new feature, and questions such as "What features exist now?" and "How can we actually use them?" come up easily. If you have those questions, please join this session. Focusing on this year's updates, we will look at what features are available now and use examples to show how your company can take advantage of them.

Azure Overview: Getting Started with the Cloud from the Basics (田中達彦)
This session explains the overall picture of Microsoft Azure, the cloud service provided by Microsoft. If you want to know what Azure is, what you can do with it, and what the benefits of using it are, please join this session.

First Steps to Making Existing Applications Cloud-Ready: Azure Virtual Machines in Practice (井上大輔)
With Azure Virtual Machines, it is easy to bring an on-premises environment into the cloud as-is and run it there. However, unlike on-premises, there are various points to be aware of. This session explains the characteristics of virtual machine instances and the deployment methods to keep in mind when migrating to Azure.

Azure RemoteApp: Keep Business Data Off the Device, and Optimize Costs by Moving Part of Your VDI to the Cloud (畠山大有)
VDI (Virtual Desktop Infrastructure) is one of the common solutions many companies adopt, both as a security foundation for business IT and as a countermeasure against data leakage on mobile devices. However, because it carries significant deployment and update costs, expectations are turning to the cloud. This session explains the Azure RemoteApp service, which addresses the simple requirement of running applications remotely.

Let's Understand Azure Networking! (井上大輔)
When building systems with Azure Virtual Machines, networking is essential: communication between multiple VMs, traffic from the internet, and so on. This session introduces networking-related features such as Virtual Network, Load Balancer and Network Security Groups, plus the recent additions Application Gateway and Azure DNS.

Azure SQL Database Scalability and Performance && SQL Data Warehouse (井上大輔)
This session introduces the DTU (Database Throughput Unit), the performance measure for Azure SQL Database, and the features related to scalability. In addition, while data warehouses have traditionally been hard to adopt because of cost and technical hurdles, the newly launched SQL Data Warehouse makes it easy for anyone to get started. We will also give an overview of SQL Data Warehouse, currently available in preview.

Power BI: Bringing Data Analysis and Work Optimization to Every Employee (川岡誠司)
It is said that all work will increasingly be driven by data (acquiring data, analyzing and optimizing it, and feeding the results into the next action). For that, we believe it is important that frontline employees, not only specialist staff, analyze data themselves and validate hypotheses using their business knowledge. In this session, an ordinary Microsoft employee, rather than a specialist engineer, explains the usability and features of Power BI from an end user's point of view.

Machine Learning on Azure: Understanding and Practicing Clustering (大田昌幸)
Machine learning is attracting attention as a way to put accumulated data to work for future decision-making. This session gives you an overview of what machine learning is and how it can be applied to your own business. Then, taking customer segmentation as an example, we explain clustering, the technique used to achieve it, in an easy-to-follow way using diagrams. After deepening your understanding of the technique, we introduce, with demos, the data platform Azure provides as a tool for actually applying machine learning in your business.

Distributed Storage and Analytics Platforms with Hadoop and Data Lake: Now and Next (畠山大有)
Large-scale data processing and analysis can now be handled reasonably well even by RDBMSs. In recent years NoSQL approaches have also matured, and Hadoop in particular has built an ecosystem as a core technology. This session covers the Hadoop implementation within Azure, and Azure Data Lake, the next big data engine, focusing in particular on application scenarios.

Developing Web Applications with Azure App Service Web Apps (田中達彦)
Azure App Service Web Apps is a platform you can use not only for internet-facing web applications but also for enterprise web applications. This session covers information related to developing applications with Azure App Service Web Apps.

Visual Studio Online + Application Insights Overview: Moving the Development Platform to the Cloud and Delivering Value Continuously (田中達彦)
This session explains the overall picture and features of Visual Studio Online, a cloud-based development platform. It also introduces Application Insights, a cloud service for detecting problems, diagnosing crashes and tracking usage in web and mobile apps, and its integration with Visual Studio Online.

Azure Active Directory Today: Identity as a Service (畠山大有)
Active Directory has long been used within companies and organizations for authentication and device management, and it is now available as a service within Azure. This session looks at what you can do with Azure Active Directory as a service, and offers material for thinking about the potential of an "authentication" service amid increasingly diverse devices and security concerns.

Investigating issues with Cloud Load Test in Visual Studio Team Services - 11/20 - Resolved


Final Update: Friday, 20 November 2015 06:42 UTC

We’ve confirmed that all systems are back to normal with no customer impact as of 11/20/2015 06: UTC. Our logs show the incident started on 11/20/2015 06: UTC.

Root Cause: A combination of an unexpected spike in usage and a service issue caused this incident.

Chance of Recurrence: We have taken the necessary steps to avoid further occurrences of this issue.

We understand that customers rely on VS Online as a critical service and apologize for any impact this incident caused.

Sincerely,
VS Online Service Delivery Team


Initial Update: Friday, 20 November 2015 04:00 UTC

We are actively investigating issues with the Cloud Load Test service. Some customers in the East US region may experience errors while executing load test runs.

We are working to resolve this issue and apologize for any inconvenience.

Sincerely,
VS Online Service Delivery Team

 

MySQL Database on Azure - Quickly Create and Use the Database Service with PowerShell


If your application needs to quickly create one or more MySQL databases from a script, MySQL Database on Azure now supports PowerShell, which lets you create, manage and otherwise operate the database service with automation scripts. Almost everything you can do in the Windows Azure management portal can also be done through PowerShell.

With just the following nine simple steps, you can quickly create and start using MySQL from a script.

Step 1: Install and configure Azure PowerShell

Before running the scripts, you need to install and run Azure PowerShell. You can download and install the latest version of Azure PowerShell through the Microsoft Web Platform Installer; see How to install and configure Azure PowerShell for detailed steps. The cmdlets used to create and manage MySQL Database on Azure databases live in the Azure Resource Manager module. When Azure PowerShell starts, the cmdlets in the Azure module are imported by default. To switch to the Azure Resource Manager module, use the following command:

Switch-AzureMode -Name AzureResourceManager

Step 2: Configure your account information

Before running PowerShell against your Azure subscription, you must first connect your Azure account. Run the following command and, on the sign-in page, enter the same email address and password you use for the Azure management portal to authenticate.

Add-AzureAccount -Environment AzureChinaCloud

Step 3: Register for the MySQL Database on Azure service

Run the following command to register for the MySQL service.

Register-AzureProvider -ProviderNamespace "Microsoft.MySql"

Step 4: Create a resource group

If you already have a resource group, you can go straight to creating the server. Otherwise, edit and run the following command to create a new resource group; you can choose your own name for it, and "resourcegroupChinaEast" is used here as an example:

New-AzureResourceGroup -Name "resourcegroupChinaEast" -Location "chinaeast"

Step 5: Create a server

Edit and run the following command, specifying your server name, location, version and other details, to create the server. The command below uses the server name "testPSH" as an example:

New-AzureResource -ResourceType "Microsoft.MySql/servers" -ResourceName testPSH -ApiVersion 2015-09-01 -ResourceGroupName resourcegroupChinaEast -Location chinaeast -PropertyObject @{version = '5.5'}

Note: Changing the SKU through PowerShell is not currently supported; servers are created with the default SKU "MS2". To adjust the SKU, make the change in the Azure management portal.

Step 6: Create a server firewall rule

Edit and run the following command, specifying the firewall rule name and the allowed IP range (start IP address and end IP address), to create the firewall rule. The firewall rule name "rule1" is used as an example:

New-AzureResource -ResourceType "Microsoft.MySql/servers/firewallRules" -ResourceName testPSH/rule1 -ApiVersion 2015-09-01 -PropertyObject @{startIpAddress="0.0.0.0"; endIpAddress="255.255.255.255"} -ResourceGroupName resourcegroupChinaEast

Step 7: Create a database

Edit and run the following command, specifying the database name, character set and so on, to create the database. The database name "demodb" is used as an example:

New-AzureResource -ResourceType "Microsoft.MySql/servers/databases" -ResourceName testPSH/demodb -ApiVersion 2015-09-01 -ResourceGroupName resourcegroupChinaEast -PropertyObject @{collation='utf8_general_ci'; charset='utf8'}

Step 8: Create a user

Edit and run the following command, specifying the user name and password, to create the user. The user name "admin" is used as an example:

New-AzureResource -ResourceType "Microsoft.MySql/servers/users" -ResourceName testPSH/admin -ApiVersion 2015-09-01 -ResourceGroupName resourcegroupChinaEast -PropertyObject @{password='abc123'}

Step 9: Grant the user permissions

Edit and run the following command to grant the user read/write permissions on the database. The permission levels are "Read" and "ReadWrite". The user name "admin" is used as an example:

New-AzureResource -ResourceType "Microsoft.MySql/servers/databases/privileges" -ResourceName testPSH/demodb/admin -ApiVersion 2015-09-01 -ResourceGroupName resourcegroupChinaEast -PropertyObject @{level='ReadWrite'}
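As a quick sanity check before connecting (a minimal sketch; cmdlet behaviour and parameter names can vary slightly between Azure PowerShell module versions), you can list what has been created in the resource group:

# List the resource group and the MySQL resources created in the steps above.
Get-AzureResourceGroup -Name "resourcegroupChinaEast"
Get-AzureResource -ResourceGroupName "resourcegroupChinaEast" | Where-Object { $_.ResourceType -like "Microsoft.MySql/*" }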

With the steps above you have created the server, database, user and firewall rule, and you can start using the MySQL Database on Azure database service. For detailed documentation, see Deploying and using MySQL Database on Azure with Azure Resource Manager and PowerShell. If you need more operations for creating, viewing, deleting or changing resources, see Managing MySQL Database on Azure with PowerShell. Finally, please keep an eye on the MySQL Database on Azure information on the portal site; we will continue to publish more feature documentation, FAQs and so on.

Contact us

The MySQL Database on Azure engineering team looks forward to your feedback. Please share your comments and suggestions in the product feedback area on the Windows Azure website.

You are also welcome to follow our WeChat public account "MySqlOnAzure", where we regularly push the latest product news.

Visual Studio Code - NEW FEATURE: VS Code Supports Extensions!


This feature was added in November in Visual Studio Code v0.10.1.

With this release, we mark our official Beta milestone and the big news is that VS Code now supports extensions (plug-ins) and is open source!

VS Code Supports Extensions!

VS Code has great features out of the box but now you and the community can extend VS Code to add new features and languages.


Find and install cool extensions by searching VS Code's public extension gallery.  There you'll find new themes, snippets, languages and tools.


VS Code has two new Extensions commands (F1 then 'ext inst') to let you find and install new extensions and manage (update, uninstall) your currently installed extensions.


VS Code also has a Marketplace where you can browse and learn more about extensions.

Extensibility SDK

If you don't find an existing extension that meets your development needs, you can create your own.  We've added extensive documentation on how to extend VS Code and a full extensibility API reference.  In addition, we provide the tools you need to create and publish extensions.

If you'd like to dive right in, you can start with our "Hello World" walkthrough where you'll have a VS Code extension up and running in a matter of minutes.

Yo Code Extension Scaffolding

We've updated the yo code generator to create a basic extension project (TypeScript or JavaScript) which has all the metadata and source files necessary for a working extension.


Extension Publishing

The vsce publishing tool lets you easily package and publish your extension.  You can share your extension with colleagues by distributing a VS Code extension package, or publish it for the community on the public gallery.
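For example (a minimal sketch, assuming the tool is installed globally from npm and run from your extension's folder):

npm install -g vsce      # install the publishing tool
vsce package             # produces a .vsix package you can share directly with colleagues
vsce publish             # publishes to the public gallery (requires a publisher ID and a personal access token)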

Extension samples

If you'd rather start your extension by modifying a working example, you can find extension samples as well as the source code for many extensions on GitHub (for example, Go Language Support).

 

Check out all the new features and update to VSC v0.10.1:

https://code.visualstudio.com/updates

  

Have a great day!

   - Ninja Ed


Free webinars during Startup Week: join in!


We have gathered the most interesting startups built on the Microsoft platform in Russia, ready to talk about using the cloud based on their own experience.

You can learn about their solutions, try their products and ask questions during Startup Week, from 23 to 27 November. Registration is required to participate.

Participants in the week of webinars include Jelastic, Teslawatch, Actionspace, DataBoom, CloudStats, LastBackend and Addreality. It will be interesting, so join us!
