Channel: Mavention

How to make a toggle switch with css


Style-facts part 6

In the last chapter of the style facts we learned how to use borders and shadows. This time we learn how to put this into practice by making our own toggle switch from a checkbox and a label. Finally, something we can use in a real corporate environment.

Most of us do not know what a default checkbox input looks like. That is probably ok, because these things are ugly. We could make a fancy and nice looking checkbox, but let’s skip that altogether and make a shiny hip toggle switch. Those are way cooler than checkboxes. The elements we need for this are a checkbox input and a label tag.

‘For’ the win

Visually we only need the label. We stated earlier that the default checkbox was ugly, so we don’t need that; we only need its functionality. So how do we transfer that to the label? To do this we add an ‘id’ attribute to the input and a ‘for’ attribute to the label, using the id of the checkbox input as its value. Now we can click the label and the checkbox gets checked and unchecked. Wonderful!

<input type="checkbox" id="switch"/>
<label for="switch"></label>

Sibling combinators and labels in the switch

So how can we make the label do ‘on’ and ‘off’ like a switch? Part of the answer is in the title of this subsection.
We will use the adjacent sibling combinator (+) to check if the input is checked and adapt the label styling accordingly.
The selectors we will use look something like this:

.switch { /* off */ }

#switch:checked + .switch { /* on */ }

We will use the background as the base of our toggle button; it is what changes color. If the input is ‘:checked’, the background color will be green, like a backlight. In the default state it will be gray, as if it is disabled. We will use a transition to make the green color fade in.
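A minimal sketch of these two states, assuming the label carries the class ‘switch’ (to match the selectors above); the exact sizes and colors are illustrative, not taken from the original demo:

```css
/* Hide the ugly default checkbox; the label does the visual work. */
#switch {
  display: none;
}

/* 'off' state: gray, disabled-looking backlight. Sizes are assumptions. */
.switch {
  display: inline-block;
  width: 60px;
  height: 30px;
  border-radius: 15px;
  background-color: gray;
  cursor: pointer;
  transition: background-color .3s ease; /* fade between the two states */
}

/* 'on' state: green backlight when the hidden checkbox is checked. */
#switch:checked + .switch {
  background-color: limegreen;
}
```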

label unchecked / checked

label checked

Toggling the switch

A ::before pseudo-element will be our toggling pin. It will move from left to right when the input is checked. To smooth this out we put a transition on the ‘left’ property of the ::before. The ::after pseudo-element will hold our on and off label. We hardcode the text for now; we’ll come back to that later.
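A sketch of the pin and the hardcoded text, continuing the assumed sizes from before; the offsets are illustrative assumptions:

```css
/* The label needs to be a positioning context for the pin. */
.switch {
  position: relative;
}

/* The toggling pin: a white circle that slides on 'left'. */
.switch::before {
  content: '';
  position: absolute;
  top: 3px;
  left: 3px;
  width: 24px;
  height: 24px;
  border-radius: 50%;
  background-color: white;
  transition: left .3s ease; /* smooth the slide */
}

#switch:checked + .switch::before {
  left: 33px; /* slide the pin to the right when checked */
}

/* The on/off text, hardcoded for now. */
.switch::after {
  content: 'off';
  position: absolute;
  left: 70px;
  line-height: 30px;
}

#switch:checked + .switch::after {
  content: 'on';
}
```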

toggle switch unchecked / checked


Our toggle button is now toggling!
We probably want dynamic text that we can adapt to the language in our styling. Let’s clean that up by incorporating ‘on’ and ‘off’ data labels, so we can put translation bindings in them. For this we use the attributes ‘data-on’ and ‘data-off’ on the label. Now we can replace content: ‘on’ with content: attr(data-on); and replace content: ‘off’ with content: attr(data-off);. Our toggle button is now finished! Here’s the codepen.
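That replacement looks like this; it assumes the label is given the data attributes, e.g. <label for="switch" class="switch" data-on="on" data-off="off"></label>:

```css
/* Read the text from data attributes instead of hardcoding it,
   so translation bindings can be put on the label markup. */
.switch::after {
  content: attr(data-off);
}

#switch:checked + .switch::after {
  content: attr(data-on);
}
```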

toggle switch off / on


Hopefully you will use this switch styling to switch from old checkboxes to fancy switches. So from now on you can:

toggle toggle

And as always leave comments below for questions and remarks!

The next blog will be in the new year 🙂

Happy Holidays!

Finally got some reading time? Read the rest of the series here:

Part 5: https://www.mavention.nl/techblog/borders-and-shadows-make-design-magic

Part 4: https://www.mavention.nl/techblog/how-to-make-a-responsive-grid/

Part 3: https://www.mavention.nl/blogs-cat/3-levels-of-css-selectors-to-select-all-elements/

Part 2: https://www.mavention.nl/blogs-cat/shaping-elements-and-its-surrounding/

Part 1: https://www.mavention.nl/blogs-cat/2-basics-you-need-to-know-to-position-elements/

 

The post How to make a toggle switch with css appeared first on Mavention.


SPFx: dynamic property panes


With the property pane we can avoid hard-coding configurations into web parts: references to lists and databases, but also text labels or custom colors and styling. Those choices can be made when deploying the web part and can always be changed very easily.

This blog is not an introduction to how SPFx web parts work, or to the property pane for that matter. I just want to share two cool ideas that we used in a recent project to create dynamic property panes for your SPFx solutions.

One solution allows the user to decide whether they want to use custom colors for the web part. If they do, the property pane is extended so they can pick those custom colors. The second one creates options based on a term set that is used in the web part.

With both examples I will show and explain the key points. I have uploaded the solution to GitHub so you can check the full solution there.
Let’s start with the custom colors. I have setup a simple web part that takes some colors based on the theme of the site:

As you can see there is a toggle ‘Use custom colors’. If you enable it, you will get two color pickers in your property pane that allow for custom color selection:

You can do this with the ‘getPropertyPaneConfiguration’ method of your web part (located in the webpart.ts file). This method ‘creates’ your property pane. I edited mine so it looks like this:

protected getPropertyPaneConfiguration(): IPropertyPaneConfiguration {
  return {
    pages: [
      {
        groups: [
          {
            groupName: 'Styling configuration',
            groupFields: [
              PropertyFieldToggleWithCallout('customColorsEnabled', {
                calloutTrigger: CalloutTriggers.Hover,
                key: 'toggleCustomColors',
                label: 'Use custom colors',
                calloutContent: React.createElement('p', {}, 'Switch this to use custom styling'),
                onText: 'ON',
                offText: 'OFF',
                checked: this.properties.customColorsEnabled
              }),
              ...this.customColorPickers(this.properties.customColorsEnabled)
            ]
          }
        ]
      }
    ]
  };
}

First, we have a toggle control. This controls the customColorsEnabled property. But the second property control in the group isn’t a control but a function: ‘customColorPickers’. This function looks like this:

private customColorPickers(customColorsEnabled: boolean): IPropertyPaneField<IPropertyFieldColorPickerProps>[] {
  if (customColorsEnabled) {
    return [
      PropertyFieldColorPicker('backgroundColor', {
        label: 'Background color',
        selectedColor: this.properties.backgroundColor,
        onPropertyChange: this.onPropertyPaneFieldChanged,
        properties: this.properties,
        disabled: false,
        alphaSliderHidden: false,
        style: PropertyFieldColorPickerStyle.Full,
        iconName: 'Precipitation',
        key: 'backgroundColorFieldId'
      }),
      PropertyFieldColorPicker('fontColor', {
        label: 'Font color',
        selectedColor: this.properties.fontColor,
        onPropertyChange: this.onPropertyPaneFieldChanged,
        properties: this.properties,
        disabled: false,
        alphaSliderHidden: false,
        style: PropertyFieldColorPickerStyle.Full,
        iconName: 'Precipitation',
        key: 'fontColorFieldId'
      })
    ];
  } else {
    return [];
  }
}

The SPFx framework will run ‘getPropertyPaneConfiguration’ every time a property changes. So if we flick the switch, the ‘customColorPickers’ method is called again. Depending on whether custom colors are enabled in that state, it will return either the extra controls or just an empty array. This way the property pane is dependent on choices we make in that same property pane.

Next is building your property pane based on a term set. The use case for us was a web part that searched and displayed content. This content had a managed metadata field that functioned as a label. We wanted to be able to configure the color for each label. So we wanted a color picker for each term that we used.
Let me quickly show you a (simplified) example:

In the property pane the first field is a termpicker. With this term picker we select the term set that provides the labels. This term set holds 3 labels. For each of the labels a color picker is created. As you can see the web part then lists the three terms with the color we set.
The ‘getPropertyPaneConfiguration’ method looks like this:

protected getPropertyPaneConfiguration(): IPropertyPaneConfiguration {
  let labelsPropertiesGroup: IPropertyPaneGroup = {
    groupName: 'Content labels',
    groupFields: [
      PropertyFieldTermPicker('termSet', {
        allowMultipleSelections: false,
        context: this.context,
        excludeSystemGroup: true,
        initialValues: this.properties.termSet,
        isTermSetSelectable: true,
        key: 'termSetsPickerFieldId',
        label: 'Term set',
        onPropertyChange: (propertyPath: string, oldValue: any, newValue: any) => this.changeLabelTermSet(newValue),
        panelTitle: 'Select a term set for labels',
        properties: this.properties,
      })
    ]
  };

  if (this.properties.termSetSelected) {
    let colorPickers = this.constructColorPickers();
    if (colorPickers) {
      colorPickers.forEach(colorPicker => {
        labelsPropertiesGroup.groupFields.push(colorPicker);
      });
    }
  }

  return {
    pages: [
      {
        groups: [
          labelsPropertiesGroup
        ]
      }
    ]
  };
}

What we do is create a group of fields with just the term picker in it to select the term set, and store this group in a variable. We check if a term set is selected and, if that is the case, we construct the color pickers. These color pickers are added to the group stored in the variable. That variable is then returned as part of the outcome of the method that provides the property pane.
The construct color pickers method:

private constructColorPickers(): IPropertyPaneField<IPropertyFieldColorPickerProps>[] {
  if (this.properties.termSetSelected) {
    if (this.properties.terms !== undefined && this.properties.terms !== null) {
      let colorPickers: IPropertyPaneField<IPropertyFieldColorPickerProps>[] = [];
      this.properties.terms.forEach(term => {
        colorPickers.push(PropertyFieldColorPickerMini(term.id, {
          label: term.name,
          initialColor: term.color,
          onPropertyChange: (propertyPath: string, oldValue: string, newValue: string) => this.updateLabelColorField(term.id, newValue),
          render: this.render.bind(this),
          disableReactivePropertyChanges: this.disableReactivePropertyChanges,
          properties: this.properties,
          key: term.id
        }));
      });
      return colorPickers;
    } else {
      this.retrieveTermSet(this.properties.termSetId);
      return [];
    }
  } else {
    return [];
  }
}

Because we are dealing with a couple of async operations, you must be careful: check if a term set is selected, but also whether the terms have already been retrieved. If that is not the case, we retrieve the term set first and return an empty array for the color pickers (for now). Retrieving the term set can be done with the help of PnP:

private retrieveTermSet(termSetId: string) {
  return new Promise<any>((resolve: () => void, reject: (error: any) => void): void => {
    const Taxonomy: Session = new Session(this.context.pageContext.site.absoluteUrl);
    const TermStore = Taxonomy.getDefaultKeywordTermStore();
    const TermSet: ITermSet = TermStore.getTermSetById(termSetId);
    TermSet.get().then((termSet: (ITermSetData & ITermSet)) => {
      termSet.terms.get().then((termsData: (ITermData & ITerm)[]) => {
        let terms: ITermModel[] = [];
        termsData.forEach((term: (ITermData & ITerm)) => {
          terms.push({
            id: term.Id.substring(6, 42),
            name: term.Name,
            color: term.LocalCustomProperties['labelColor']
          });
        });
        this.properties.terms = terms;
        if (this.context.propertyPane.isPropertyPaneOpen()) {
          this.context.propertyPane.refresh();
        }
        resolve();
      }).catch((error) => {
        console.error(error);
        reject(error);
      });
    });
  });
}

As you can see, once the terms are retrieved we check if the property pane is open. If it is, we force a refresh so the color pickers are rendered for the terms we just received.

I have created a very simple model for terms with three properties: id, name and color. The color is a local custom property. This means that you can set an initial value as a local custom property on the term.

The final piece of the puzzle is to override the onPropertyChange method of the color picker. You can see I call ‘updateLabelColorField’. This method ensures that the label color for that term is updated on the model that I created. This model is passed on in the application and used to create the labels. The update method looks like this:

private updateLabelColorField(labelId: string, labelColor: string): void {
  for (let i = 0; i < this.properties.terms.length; i++) {
    if (this.properties.terms[i].id === labelId) {
      this.properties.terms[i].color = labelColor;
    }
  }
}

As we saw before we retrieved a default color from the local custom properties of the terms. You could also save the changes you made here and update that local custom property on the term.

These are two ways you can build a dynamic property pane that serves your needs.


Usage of SharePoint Image Renditions in Display Templates


The SharePoint Online intranet of a Dutch company had performance issues: certain pages took way too long to load. In a Microsoft Teams meeting I could take a look at the specific pages and the code via the developer toolbar of this intranet. It was clear that the news overview pages were causing the performance issues, and after diving into the HTML code of these pages I found out that for every thumbnail the complete, full-blown uploaded picture was used.

The customer granted me access to their environment so I could fix this.

Image Renditions

First, create another Image Rendition with the right dimensions. Image Renditions are versions of an image with different dimensions that are automatically created when you upload an image to a SharePoint library.
Via Site settings – Image Renditions I created an extra Image Rendition with the right dimensions.
It automatically gets an ID, in this case 5. We will use this ID in our Display Template.

image renditions

Display Template

Then check on the specific page which display template the search web part uses. Download and change this Display Template (via Site settings – Master pages and page layouts – Display Templates).
I looked for the code where the image is set:

<img src="_#= imgtag =#_"/>

So the variable that is used is ‘imgtag’.

Then I changed this variable from:

var pictureURL = $getItemValue(ctx, "Picture URL"); 
var imgtag = pictureURL; 

To:

var pictureURL = $getItemValue(ctx, "Picture URL");
var imgtag = pictureURL + "?RenditionID=5";

Upload and check in the file, and from now on this overview uses the much smaller image.
In this example a specific image is 3.93 MB in the original format and 46 KB in the rendition format.
Imagine an overview page with more than 50 news items and the difference in page load between these two scenarios!

If you would like to know more about this subject please contact me.

 


Top 10 Flow Best Practices That Will Improve Your Flow Game – Pt.1


Hands up if you love Flow! Flow is an incredible tool for automating various tasks and developing business logic. After an intensive migration, where I had to write numerous Flows for an enterprise company, I came up with the following best practices in this practical top 10 list, because we all love lists (and clickbait titles).

1.   Choose a language and stick to it

The language you use in an O365 environment can vary from account to account, or even from browser to browser. But why is it so important to stick to a language in a Flow? Well, look at the picture below.

Flow in different languages

Yes that’s right. We now have French, English, Russian and Dutch in a single Flow. Apart from this being confusing as heck, this will also occasionally break your Flow because the syntax differs between languages. An expression made in Dutch might not work when you open the Flow in English. That’s why you’ll be much better off just sticking to one language. Speaking of which….

 

2.   You know what? Just always use English

You can make your life a whole lot easier by just always using English as the default language. Why? Well it just makes troubleshooting so much easier when you can actually Google the error messages. IT consultants communicate in English on the internet, so there is really no reason why you shouldn’t as well.

If the language is set by the company and you can’t change it manually, just add ‘?l=en-US’ to the Flow url. The Flow will now open in English regardless of other language settings.

Flow URL

 

3.   Name your Actions (like immediately)

Yes, guilty as charged. When creating a Flow I sometimes, in the heat of the moment, forget to name my actions. This isn’t a big problem with small Flows, but as your Flows get bigger, so does the complexity. This becomes apparent when working with dynamic content. If you are wondering whether you should get the fields from “Get items 7” or “Get items 12”, you probably should have named your actions.

This is something you should be doing from the get-go. Once you have created an action and put a few actions under it, you can’t rename it anymore! So don’t make this mistake and name your actions right after you create them.

No way to rename Flow Action

4.   Don’t lose your stuff. Make backups

My biggest gripe with Flow? There is no versioning and no recycle bin.

You made a change and now the Flow doesn’t work anymore? I hope you know what you changed.

Someone else made a change? No way of knowing what that was.

Or, god forbid, someone in your organisation deleted your Flow that took 4 days to build (Yes! This! Happened!). Sucks to be you, because it is gone!!!

Although there is no versioning, you can make backups by exporting your Flow to *.Zip. It is not ideal, but you will be glad you did this if you find your Flow broken or gone!

Export Flow

5.   Move your Flows to a service account using export/import

You have made an incredible Flow. It’s a business-critical process, it’s used daily by a lot of people, and all is good. But you get a job offer somewhere else and your account is deleted. This is a problem, because the Flow and the Flow’s permissions are all tied to your account. All of a sudden the Flow is gone, and the business-critical process has stopped working. How can we prevent this?

You could add additional owners to the Flow, but what will happen if they leave? The best practice is to make an account especially for Flows (e.g. ServiceAccountFlow). This will make sure the Flows don’t evaporate once you leave the company.

The best way to do this is to export the Flow to *.zip and then import the Flow using the Flow Service Account. Just make sure this account has the needed permissions to run the Flow, because the Flow will now run under the Flow service account. As an added bonus, (Outlook) mails and notifications will no longer be sent from your account, but from the service account.

 

Pt.2 with tips 6-10 will be up soon!


Confusing RSS connector in Flow


During the holidays I finally found some time to play around with Flow again. Currently I have a recipe running in IFTTT that posts a tweet whenever I finish a book on Goodreads. The recipe uses an RSS trigger and a Twitter connector, and thus is pretty straightforward, so replicating it in Flow sounded pretty straightforward too. However, the RSS trigger is quite limited in Flow, as I found out in the first few minutes. Flow provides the ‘when a feed item is published’ trigger that fires whenever a new item is added to an RSS feed. As Goodreads provides an RSS feed, this sounded like a great way to start. While the trigger does pick up the changes, it does not return the full item, so you might be missing information when you try to process the data.

Flow RSS trigger

The Flow RSS connector is a standard connector that you can use to retrieve feed information and trigger a flow to run when new items are published to an RSS feed. Updates to existing data will not trigger the flow. As you can read in the documentation, the trigger returns a wrapper object that contains all feed items. While your feed might return any RSS data, the FeedItem that is returned in Flow will only contain the following values:

  • Feed ID
  • Feed categories
  • Feed copyright information
  • Feed links
  • Feed published on
  • Feed summary
  • Feed title
  • Feed updated on
  • Primary feed link

In most cases the title and summary should contain the information you need. However, if you are using a Goodreads RSS feed you will only get the title of your book and some HTML in the summary, but no other information. So imagine that you hit the Goodreads RSS feed for read books. When you use a feed reader or view the source of the feed, you will see that it returns an array of items. Each item looks like the following:

<item>
  <guid><![CDATA[https://www.goodreads.com/review/show/2622187455?utm_medium=api&utm_source=rss]]></guid>
  <pubDate><![CDATA[Wed, 02 Jan 2019 01:41:25 -0800]]></pubDate>
  <title>Icarus (Benny Griessel, #5)</title>
  <link><![CDATA[https://www.goodreads.com/review/show/2622187455?utm_medium=api&utm_source=rss]]></link>
  <book_id>28257568</book_id>
  <book_image_url><![CDATA[https://images.gr-assets.com/books/1450689371s/28257568.jpg]]></book_image_url>
  <book_small_image_url><![CDATA[https://images.gr-assets.com/books/1450689371s/28257568.jpg]]></book_small_image_url>
  <book_medium_image_url><![CDATA[https://images.gr-assets.com/books/1450689371m/28257568.jpg]]></book_medium_image_url>
  <book_large_image_url><![CDATA[https://images.gr-assets.com/books/1450689371l/28257568.jpg]]></book_large_image_url>
  <book_description><![CDATA[17 december. Het lichaam van Ernst Richter, internetondernemer en oprichter van de controversiële website Alibi.co.za, wordt aangetroffen in een ondiep graf in de zandduinen vlak bij Parklands. Hij werd sinds een maand vermist.<br /><br />Niemand binnen de politiemacht wil de zaak aannemen vanwege de politieke consequenties voor de betrokkenen én de onherroepelijke mediatsunami die zal volgen op de moord op een persoon zo omstreden als Richter. En dus mag de speciale eenheid 'de Valken' ermee aan de slag.<br /><br />Zelfs onder normale omstandigheden zou deze zaak al genoeg kopzorgen veroorzaken voor Bennie Griessel en Vaughn Cupido. De medewerkers van Alibi.co.za behoren niet bepaald tot het mededeelzame soort en de lijst van mensen die Richter met hun blote handen willen wurgen is lang. Maar de omstandigheden zijn allesbehalve normaal. Griessel heeft de fles weer ontdekt en Cupido is verliefd - op een van de hoofdverdachten.<br /><br />En op 24 december zet de verklaring van een jonge wijnboer de zaak volledig op zijn kop.<br /><br />Kerstmis zal nooit meer hetzelfde zijn.]]></book_description>
  <book id="28257568">
    <num_pages>395</num_pages>
  </book>
  <author_name>Deon Meyer</author_name>
  <isbn>9044973789</isbn>
  <user_name>Albert-Jan</user_name>
  <user_rating>0</user_rating>
  <user_read_at></user_read_at>
  <user_date_added><![CDATA[Wed, 02 Jan 2019 01:41:25 -0800]]></user_date_added>
  <user_date_created><![CDATA[Mon, 10 Dec 2018 14:04:59 -0800]]></user_date_created>
  <user_shelves>currently-reading, 2018</user_shelves>
  <user_review></user_review>
  <average_rating>3.50</average_rating>
  <book_published>2015</book_published>
  <description>
    <![CDATA[
    <a href="https://www.goodreads.com/book/show/28257568-icarus?utm_medium=api&amp;utm_source=rss"><img alt="Icarus (Benny Griessel, #5)" src="https://images.gr-assets.com/books/1450689371s/28257568.jpg" /></a><br/>
                                    author: Deon Meyer<br/>
                                    name: Albert-Jan<br/>
                                    average rating: 3.50<br/>
                                    book published: 2015<br/>
                                    rating: 0<br/>
                                    read at: <br/>
                                    date added: 2019/01/02<br/>
                                    shelves: currently-reading, 2018<br/>
                                    review: <br/><br/>
                                    ]]>
  </description>
</item>

However, when you check the response you get from the trigger, it contains only the following information:

{
  "id": "https://www.goodreads.com/review/show/2622187455?utm_medium=api&utm_source=rss",
  "title": "Icarus (Benny Griessel, #5)",
  "primaryLink": "https://www.goodreads.com/review/show/2622187455?utm_medium=api&utm_source=rss",
  "links": [
    "https://www.goodreads.com/review/show/2622187455?utm_medium=api&utm_source=rss"
  ],
  "updatedOn": "0001-01-01 00:00:00Z",
  "publishDate": "2019-01-02 09:41:25Z",
  "summary": "\n      \n      <a href=\"https://www.goodreads.com/book/show/28257568-icarus?utm_medium=api&amp;utm_source=rss\"><img alt=\"Icarus (Benny Griessel, #5)\" src=\"https://images.gr-assets.com/books/1450689371s/28257568.jpg\" /></a><br/>\n                                      author: Deon Meyer<br/>\n                                      name: Albert-Jan<br/>\n                                      average rating: 3.50<br/>\n                                      book published: 2015<br/>\n                                      rating: 0<br/>\n                                      read at: <br/>\n                                      date added: 2019/01/02<br/>\n                                      shelves: currently-reading, 2018<br/>\n                                      review: <br/><br/>\n                                      \n    ",
  "copyright": "",
  "categories": []
}

With that information it is quite hard to post a tweet, as you are missing the image, there is no information on the user rating, and the author is buried in the summary.

How to retrieve all information from RSS

In order to retrieve all the information you require from your RSS feed, you need four actions:

  • Step 1 is the RSS trigger that fires whenever there is an update
  • Step 2 is an HTTP call to retrieve the full XML of the RSS feed again
  • Step 3 is to create an XPath filter that allows you to retrieve the correct item. We require an XPath as there might have been multiple updates on the RSS feed and we need the data for the item that fired the trigger.
  • Step 4 is to retrieve the data based on the XPath

Flow actions to retrieve full RSS body

The first two actions are pretty straightforward: just add an RSS trigger and use the same URL in an HTTP GET call. Please be aware that the HTTP action will become a premium connector. :-(

Flow trigger and action to retrieve data

Use XPath to get your data

Now that we have all our XML data retrieved from the RSS feed in the HTTP GET action, we can use an XPath filter. The XPath filter for Goodreads looks like the following:

concat('(//rss/channel/item[title = "', triggerBody()?['title'], '"])')

As you can see, it will construct a new string with the //rss/channel/item[title="booktitle"] XPath query. That way you can be sure that you retrieve the correct XML. The next step will be to retrieve the book data element. That is simply applying the XPath query to the data retrieved from the HTTP request:

xml(xpath(xml(body('HTTP')), variables('XPathFilter'))[0])

From left to right: we first make sure the result is returned as XML by adding an xml() wrapper around the XPath expression. Second, the XPath query can only be run against an XML element, so the HTTP body is wrapped in xml() as well. And as an XPath query can return multiple values, make sure to add [0] to retrieve the first item that matches the XPath.

Flow action to filter data

The response of the ‘Retrieve Book data’ action is a single item element. This element can then be used to execute additional XPath queries against, to retrieve properties that are normally not available in your RSS flow. For instance, with the Goodreads RSS you can then create a new variable, or inject directly into your action, the following XPath:

xpath(xml(variables('BookXmlData')), 'string(/item/user_rating)')

This will retrieve the user_rating for the book you have just added to your read shelf. You can use XPath to retrieve any property or element from the <item> element you get returned. You can find more resources on how to use XPath on w3schools.

Using flow for tweets

I use the above flow to auto-tweet whenever I finish a book on Goodreads. Before, I used a recipe in IFTTT that did pretty much the same but didn’t require any hacking. The downside of IFTTT is that you cannot tweak it at all; the plus side is that you get better results in its RSS action than Flow provides out of the box. However, now that I have figured out how to retrieve all XML properties using XPath, I feel I can use the same approach for some other scenarios as well. I am a bit sad that the HTTP action will move to premium though.

Originally posted at: https://www.cloudappie.nl/flow-confusing-rss-connector/


Transitions vs. animations in (S)CSS


Style-facts part 7

We have all seen smooth transitions on fancy webpages. It looks cool, and most of it is done with JavaScript. Not everyone knows that most sliding open and closed, or scaling, can be done with CSS, which might even be more efficient since CSS3. The two tools used for this are ‘transition:’ and ‘animation:’. But what exactly is the difference between the two?

Transitions

Transitions are used to bridge the change between two states of a single style attribute (or a few) on an element. For example: if you want to slide in a side menu from the left, the start position of ‘left:’ needs to be specified, and ‘transition: left .5s ease’ as well. ‘left’ in the transition refers to the attribute. ‘.5s’ is half a second and is the duration of our transition. ‘ease’ is the type of transition; there are many others. This one starts and ends smoothly, whereas ‘linear’ can feel a little mechanical. If you wanted to, for instance, make the menu fade in as well, you would specify ‘opacity: 0;’ and ‘transition: left .5s ease, opacity .5s ease;’.
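The side menu described above could be sketched like this; the class names ‘.side-menu’ and ‘.open’ and the 300px width are assumptions for illustration:

```css
/* Closed state: parked off screen to the left and fully transparent. */
.side-menu {
  position: fixed;
  top: 0;
  left: -300px;
  width: 300px;
  height: 100%;
  opacity: 0;
  /* Transition both attributes over half a second with a smooth easing. */
  transition: left .5s ease, opacity .5s ease;
}

/* Open state (e.g. a class toggled on click): slides and fades in. */
.side-menu.open {
  left: 0;
  opacity: 1;
}
```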

Animations

Animations are for more complex movement. Unlike transitions, animations work with keyframes, which give you full control over the animation speed and the properties you want to set. The @keyframes rule in CSS does not actually use keyframes like we used in ancient times in Flash. It defines its animation in two ways: with from/to, or with percentages. The first uses the words ‘from’ and ‘to’, like this:

@keyframes sunFadeIn {
	from {
		opacity: 0;
		background-color: red;
		top: 10px;
	}
	to {
		opacity: 1;
		background-color: orange;
		top: 0;
	}
}

This is used for simple animations; from/to equals 0%/100%. To orchestrate more complex ones, percentages can be used. They give you optimum control over the animation, like this:

@keyframes sunFadeIn {
	0% {
		opacity: 0;
		background-color: green;
		top: 10px;
	}
	5% {
		opacity: .25;
		background-color: green;
	}
	10% {
		opacity: .5;
		background-color: yellow;
		top: 9px;
	}
	20% {
		opacity: 1;
		background-color: red;
		top: 8px;
	}
	100% {
		opacity: 1;
		background-color: yellow;
		top: 0;
	}
}

You can make these as complex as you want: use multiple attributes and different transitions at different percentages. On top of both types of complexity, animations can be given the same duration and easing as transitions. What transitions can’t do and animations can, is define how they play. We can reverse an animation, play it once or infinitely, and make it alternate. You can even define the number of times it should play, and whether it should pause when you mouse over the animated object. Most things you can do with the keyframes syntax can be found here.
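A short sketch of how the sunFadeIn keyframes above could be applied; the ‘.sun’ selector and the timing values are illustrative assumptions:

```css
/* Run the keyframes over 2s, forever, reversing direction each cycle. */
.sun {
  position: relative; /* needed because the keyframes animate 'top' */
  animation: sunFadeIn 2s ease-in-out infinite alternate;
}

/* Pause the animation while the user mouses over the element. */
.sun:hover {
  animation-play-state: paused;
}
```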

Keep it simple

Keep transitions simple and fast. Use the animation attribute for complex changes of styles, for objects that need to pulse and go crazy on the screen. Use both for magical and smooth movement on the screen, but be aware: it’s easy to drive your users crazy with this stuff. So use it for good, not bad.
Until the next topic of the style facts, and as always: if you have questions, comments or tips, please leave them below!

 

For extra style reading:

Part 6: https://www.mavention.nl/blogs-cat/how-to-make-a-toggle-switch-with-css
Part 5: https://www.mavention.nl/techblog/borders-and-shadows-make-design-magic
Part 4: https://www.mavention.nl/techblog/how-to-make-a-responsive-grid/
Part 3: https://www.mavention.nl/blogs-cat/3-levels-of-css-selectors-to-select-all-elements/
Part 2: https://www.mavention.nl/blogs-cat/shaping-elements-and-its-surrounding/
Part 1: https://www.mavention.nl/blogs-cat/2-basics-you-need-to-know-to-position-elements/

The post Transitions vs. animations in (S)CSS appeared first on Mavention.

Start 2019 right with the Identity secure score check!


2018 was the year in which security and data loss received a lot of attention. News about leaked passwords and sensitive documents still comes up regularly, and even Office 365 itself has not been spared. In addition, the EU introduced the new GDPR regulation, which forced many organizations to adapt their operations. 2019 will still be about optimizing your security, but also about making your users aware that they should handle their data and passwords with care.

To help with this, Office 365 has introduced the ‘secure score’. This is a score based on all kinds of settings within your Office 365 environment that touch on security. It covers all areas of your environment, such as identity management, device management, apps & data and your infrastructure. For example, it looks at your password policy and whether you have multi-factor authentication (MFA) enabled. If that is the case and it is configured correctly, your secure score goes up.

So it gives insight into how well your environment is secured. But beware: it is a tool, and the fact that a setting raises the score does not automatically mean that it is the best fit for your organization.

Nevertheless, it is valuable to look at this score, and to help you on your way I will highlight a few settings with identity management as the focus. These will definitely improve your score.

Identity secure score:

  • Enable and enforce MFA: signing in with just a password is easy, but ultimately a weak link. So enable MFA, and if you don’t want to turn it on for your entire organization, do it at least for the roles with admin rights.
  • Take care of your password policy. The tips here: don’t let passwords expire, provide self-service password reset and require modern password rules.
  • Deactivate accounts that have not been used in the last 30-45 days. These accounts are not checked regularly by their users, so it goes unnoticed when they fall into the wrong hands.
  • Review your Global Admin policy. A few important tips:
    • Nobody should have Global Admin rights on their everyday user account; use a dedicated admin account (with MFA) instead.
    • Minimize the number of Global Admin accounts, but make sure you have more than one, in case you lose access to one of them. Make sure other admins only have admin rights where they need them.
  • Automate your sign-in and user risk policies. If you have to disable accounts with suspicious activity manually, you are basically already too late. Let it happen automatically as soon as an account becomes suspect.

These tips will improve your secure score, but above all lead to a more secure Office 365 platform. Start 2019 right and pay attention to this! For now, best wishes and a safe 2019!

Want to know more about this topic? Then contact us.


Using AI to classify your SharePoint Data


Last year I finally had the opportunity to work on a real-life AI scenario. One of our customers was looking to auto-tag their data to improve findability. Based on the customer needs a colleague (Rutger) and I spent some time preparing a Proof of Concept. The goal of this PoC was to prove that AI can help in classifying the data. Rutger had started working at Mavention only a few months earlier, after graduating here. His graduation project was on using AI and chatbots to interact with one of our products. When writing his thesis he spent some time on language understanding with LUIS. So when our customer asked us to see what we could do to classify their data, we ended up spending some time together. We drafted some requirements to prove that we could get it all together.

The use case

Imagine a customer that has migrated to SharePoint Online. They run in a hybrid scenario due to legislation. They also have terabytes of old data that might one day be migrated. So there is a set of data that needs to be updated with classification or metadata. The data is saved on file systems or network shares. There is no real way to identify the type of data, as there is no metadata, apart from some metadata in the contents of the files themselves.

Using SharePoint Search

The quickest way to provide insights is to add the file share to the SharePoint search results. Legislation can be met by adding it to the local index rather than including it in SharePoint Online. That way on-premises users can search their file share for files, and SharePoint provides a full-text experience. SharePoint itself provides Custom Content Enrichment to enrich your search results, and the search engine provides Entity Extraction to help determine the type of content.

Using Azure AI

Azure provides several AI solutions that can help to work with content. You can use AI solutions to determine the language of a file. You can extract several known properties such as locations, keywords, numbers, and names. And you can use LUIS to determine the context of specific elements. The only downside of these AI services is that you have to feed them small chunks of information. You cannot push in the complete document: the Azure AI services work with between 400 and 5,000 characters at most. So you will end up with some custom code to split up content.
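To illustrate the splitting, here is a language-agnostic sketch in Python (the function name and the whitespace-based strategy are my own; real code would pass each chunk to the Azure endpoint):

```python
def split_into_chunks(text, max_len=5000):
    """Split text into chunks of at most max_len characters,
    breaking on whitespace so words stay intact."""
    chunks = []
    while len(text) > max_len:
        # Prefer to cut at the last whitespace before the limit
        cut = text.rfind(" ", 0, max_len)
        if cut <= 0:  # no whitespace found: hard cut at the limit
            cut = max_len
        chunks.append(text[:cut].strip())
        text = text[cut:]
    if text.strip():
        chunks.append(text.strip())
    return chunks
```

Each returned chunk stays under the service limit while keeping words intact, so it can be sent to the language detection or LUIS endpoint as-is.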

The pudding

Back to our proof of concept. We added our documents to SharePoint search for full-text search capabilities. We then implemented a content enrichment service. The content enrichment service allows you to extend the search results: adding such a service allows you to add additional metadata to each search result. In our case, we used this content service to retrieve the contents of the document. These contents were then scrubbed for metadata we could determine. We retrieved the footer of the document and, additionally, the first few pages to check for a cover page with information. Based on this data we checked whether the security classification allowed us to process it in the cloud.

Schematic view of AI & Search components

If the data could be processed in the cloud, it is sent to the Azure AI services to determine the language. Once the language is detected, the correct LUIS training model is used to get intents and entities. This information can be passed back to the Content Enrichment service. By returning this information you can re-use it in your SharePoint Search center. The result is that you can use LUIS entities as refiners and filters.

Samples we played around with were simple ones like project numbers or ISBN numbers, as well as more complex ones like authors (based on their display names). You can even train a model for topics for reports and letters. By injecting that data into the search index you can use those entities as refiners. Another option is to build your own display template to show those properties to your users.

I loved it!

I loved playing around with some of the AI options we have in a real-life scenario. The SharePoint Search experience combined with AI services proved to be a powerful combination. It allowed us to extract massive quantities of data that were not available before. In only a few days we managed to set up a Proof of Concept. This allowed us to prove what type of data we could retrieve. And as training LUIS takes minimal effort, we could show the whole process in a matter of days. Once you have proof that a scenario works, you can move to a larger project to train a more complex model, or retrieve more information using the AI services.

Originally posted at: https://www.cloudappie.nl/ai-classify-sharepoint-data/



OneDrive Sync client and Company Name


At Mavention we have been using the modern OneDrive sync client forever. It works perfectly and Files on Demand are awesome! However recently we encountered some new behaviour. When syncing document libraries the results looked different.

So what changed?

Previously whenever we synced a library the data was stored on the following path:

C:\OneDrive\Mavention

So imagine syncing the Content Type Hub. A new folder ends up at the following path:

C:\OneDrive\Mavention\Content Type Hub – Mavention Templates

In File Explorer these folders are all aggregated, so opening File Explorer you would see ‘Mavention B.V’ as a header. We never paid much attention to this header, but it is missing a dot (.).

As of the first week of January 2019 this behaviour has changed. Whenever we now sync a new library, it is saved to a different path:

C:\OneDrive\Mavention B.V

So imagine syncing the Sales Library. A new folder ends up at the following path:

C:\OneDrive\Mavention B.V\Sales – Documents

The reason people noticed this change is the change in header. We now have an extra header in the Explorer, and it does not show ‘Mavention B.V’ but ‘Mavention B’.

Mavention OneDrive new scenario

File explorer and dots

So what changed? Recently Microsoft pushed changes to the OneDrive Sync client. Due to these changes the OneDrive Sync client is now using the company name. The company name can be set in the Organizational Profile available in the Admin Center. Our company name has been Mavention B.V.

Due to these changes Microsoft Support provided us with guidance to no longer use dots in the name, as File Explorer will strip any trailing dot from a folder name. You can find more on naming conventions on docs.microsoft.com.

So we ended up changing the company name in the organizational profile in the admin portal. By removing the dots we managed to prevent future problems.
The only downside was that we had to instruct all our users to remove all active synced libraries and unlink their PC. Only by removing them and re-syncing them we could get them all under a single header again.

Originally posted at: https://www.cloudappie.nl/onedrive-sync-client-and-company-name/


Migrate Legal Hold Information


Ever had to do a tenant-to-tenant migration? Me neither, up until a few months ago. A large merger resulted in two organizations that had to merge their Office 365 tenants. There are a few vendors that can help you migrate SharePoint Online or Exchange Online, so moving that data was easy. The goal was to write off one of the O365 environments: all data had to be migrated to the other O365 tenant. Once all data was moved, the old environment was decommissioned.

Legal Hold information

So the requirement was straightforward: make it possible to turn off one of the environments. The migration vendors and tooling out there made it easy to migrate most of the content. Yet it turned out that migrating the existing legal hold information was difficult. Office 365 out of the box does not provide a solution to export all legal hold information, and there were no vendors providing a tool that could migrate this type of data. So we ended up experimenting with the Compliance Center and PowerShell cmdlets.

Compliance Center

With the ‘new’ compliance center you can view legal hold information. You can use the UI to find and export your legal holds. Luckily everything you can do through the UI can be done through PowerShell as well. So in order to keep all legal hold data, just two steps are required:

  1. Get all mailboxes including inactive ones
  2. Get legal hold information for this mailbox

We could not find legal hold information without retrieving the mailboxes first. Depending on the organization size you might have a large data set. Use the mailboxes retrieved to then get the Legal Hold information. This information can be retrieved using a Compliance Search. Once a Compliance Search is created it can be downloaded. Each download can be specified to contain a PST file per mailbox. Each PST file can then be stored. Make sure to save this information securely as it contains all emails per user.

PowerShell and Compliance Cases

In order to work with Compliance Cases and PowerShell, you need to include some Exchange Modules. These modules differ from the normal exchange modules. In case you are working with the normal Exchange modules as well you are required to load them separately. Loading them separately can be done using a prefix option in the Import.

# Connect to both Exchange Online and the Security & Compliance endpoint
$SccSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.compliance.protection.outlook.com/powershell-liveid/ -Credential $UserCredential -Authentication "Basic" -AllowRedirection
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication "Basic" -AllowRedirection
# Import Protection and Security with a prefix 
Import-PSSession $SccSession -AllowClobber -prefix cc
Import-PSSession $Session -AllowClobber

Once loaded you can use normal commands to retrieve all mailboxes. Keep in mind to load them all. Also, it makes sense to only retrieve mailboxes with a litigation hold. Based on that you can create a new Compliance Case. This case can be created with the New-ComplianceSearch. As we loaded with a prefix it becomes New-ccComplianceSearch. More details on the New-ComplianceSearch.

#Get all mailboxes 
$mailboxes = Get-Mailbox -IncludeInactiveMailbox -ResultSize unlimited | ? {$_.LitigationHoldEnabled -eq $true} | Sort-Object {$_.PrimarySmtpAddress}

# Create Compliance search and start it 
New-ccComplianceSearch -Name "Demo Export" -Description "Created to export all legalholds" `
        -ExchangeLocation $someArrayWithEmailAddresses -AllowNotFoundExchangeLocationsEnabled $true 

Start-ccComplianceSearch -Identity "Demo Export" 
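Once the search completes, the results can be exported. A sketch of that step, continuing the script above (the export format switches shown here are based on the New-ComplianceSearchAction cmdlet and may need tweaking for your tenant; note the same cc prefix applies):

```powershell
# Wait for the compliance search to finish
while ((Get-ccComplianceSearch -Identity "Demo Export").Status -ne "Completed") {
    Start-Sleep -Seconds 30
}

# Create the export action; results (one PST per mailbox)
# can then be downloaded from the Compliance Center
New-ccComplianceSearchAction -SearchName "Demo Export" -Export `
    -Format FxStream -ExchangeArchiveFormat PerUserPst
```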

Once you are done make sure to remove the active sessions using the Remove-PSSession.

Remove-PSSession $Session
Remove-PSSession $SccSession

It is a bit cumbersome to migrate legal holds. There is no clear documentation on what is supported, nor is there tooling that makes it easy. Using PowerShell does speed things up. Keep in mind though that exporting sets can become slow depending on their size. Our best performance was reached using sets of 500 GB. The documentation states 2 TB as the export limit, but that only resulted in timeouts.

Originally posted at: https://www.cloudappie.nl/office365-migrate-legalhold/


Use Flow for bulk edits (even on lists with more than 5000 items)


Ah yes. Bulk edits. The tedious job we hand off to interns.

Let’s say we have a SharePoint library with 15000 documents. In this list there is a ‘Document owner’ field. But… people move/get other jobs/win the lottery and won’t be around till the end of time. How can we easily change the document owner to his/her replacement?

  • One at a time is definitely out of the question. This will take your poor intern days to process.
  • In SharePoint you could use the Excel-like edit function. This will allow you to edit 100 items at a time, but it is reeeallly slow and clunky, and it’s easy to make errors this way.
  • You could use PowerShell. This is an excellent solution, but for this you’ll need proper Tenant credentials (which in most cases you won’t have) and even worse… you need to know how to write PowerShell scripts.

Luckily, Flow can do this as well. And you can even make it so ‘normal’ users can use it.

Step by step explanation

Bulk edit Flow

  1. Manually trigger the flow with a button press. On button press fill out the original email address and the email address you want to replace it with.
  2. In this compose action we’ll make the query. The Query states:
    [ColumnName]/Email eq [email address]
    This translates to: give us all the items whose email address matches the one entered at the start of the flow.
  3. Now to get the files we need to change. We’ll take the Compose output we made in step 2 and add it to the Filter Query. You could put a query in this field directly, but when you use variables in your query this will give you a syntax error.
  4. Here we actually change the email address to the ‘to be replaced’ email address we added earlier. Once you add this, you’ll get an ‘Apply to each’.
    NOTE!! The Apply to each has an item limit of 5000. So what happens when your library has more items?
    The ‘Apply to each’ only targets the items from the Filter query. So as long as the result set from the ‘Get Files’ from step 3 is below 5000 items there shouldn’t be any problem.
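As a concrete sketch of the query from step 2 (both the column name ‘DocumentOwner’ and the address are made-up examples; use the internal name of your own person column):

```
DocumentOwner/Email eq 'old.owner@contoso.com'
```

The Compose output containing this string is what goes into the Filter Query field of the ‘Get files’ action in step 3.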

To do the bulk edit, the only thing you have to do is start the flow, fill out the original email address and the email address you want to replace it with and flow will do the rest of the work for you. And even if it changes 5000 items, it will only cost you one Flow run!!!


Using the Site Title in Flow triggered by a Site Design


Site Designs are awesome. There is no doubt about that! Yet sometimes you need more options. Or you might have to apply settings that are not available yet. Luckily you can call Flow. By using Flow you can call any custom option you want. There are some great articles that explain both scenarios. Check out Calling Flow from a site Script and Calling the PnP provisioning engine from a site script. However the samples do not provide details on how to get the Site Title in your Flow trigger. 

Use the site title in Flow

While the samples are great, it felt like something was missing: in the provided sample the site title is absent. Imagine that you must provide a report that contains both the URL and the title. The title is passed by the Site Script and thus can be used. All you must do to use the title is change your Flow. In the Request – When a HTTP Request is received trigger you can change the body:

{
 "type": "object",
 "properties": {
     "webUrl": {
         "type": "string"
     },
     "webTitle": {
         "type": "string"
     },
     "parameters": {
         "type": "object",
         "properties": {
             "event": {
                 "type": "string"
             },
             "product": {
                 "type": "string"
             }
         }
     },
     "webDescription": {
         "type": "string"
     },
     "creatorName": {
         "type": "string"
     },
     "creatorEmail": {
         "type": "string"
     },
     "createdTimeUTC": {
         "type": "string"
     }
 }
}

By adding the webTitle property it becomes usable in your flow. As the title is already passed, this is the only step you need to change. Once done you can use the title wherever you want: it is available as a parameter in your process.

Flow action with webTitle available
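On the calling side, a site script triggers this flow with the triggerFlow action. A minimal sketch, assuming the rest of the script from the referenced samples (the url value is a placeholder for your own HTTP trigger address):

```json
{
  "$schema": "https://developer.microsoft.com/json-schemas/sp/site-design-script-actions.schema.json",
  "actions": [
    {
      "verb": "triggerFlow",
      "url": "https://prod-00.westeurope.logic.azure.com/workflows/<your-trigger-url>",
      "name": "Record site creation",
      "parameters": {
        "event": "site creation",
        "product": "SharePoint Online"
      }
    }
  ],
  "version": 1
}
```

SharePoint appends webUrl, webTitle, webDescription and the creator details to the request body automatically; only the custom parameters are defined here.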

Additional properties

Unfortunately, if you need additional properties you can only retrieve them using a REST call. Triggers from the site design do not provide additional options. You can use either the Graph or the SharePoint REST API. For calling a custom site provisioning engine the title and URL will be sufficient in most cases.

Originally posted at: https://www.cloudappie.nl/sitetitle-flow-sitedesign/


Mavention is part of the Microsoft Teams / SPFx Developer Preview Program!


Want to try Microsoft Teams?

We are proud to be part of the Microsoft Teams / SPFx Developer Preview program! This program helps us develop our technical knowledge in the areas of Teams and SPFx even further. It gives us the chance to provide feedback, but above all to learn from Microsoft’s Teams/SPFx development team. Our development and product team gets the opportunity to think along about feature requests at an early stage. On top of that, we will contribute to discovering bugs & gaps so they can be fixed.

It means we will hear about the newest features and technical background of Microsoft Teams and the SharePoint Framework even sooner.

With this knowledge we can solve your questions and challenges around this topic even better!

If you have any further questions, contact us at info@mavention.nl


“By really listening to the customer, I can make connections.”


Kasia Razniak is an account manager at Mavention. But not just any account manager: in her role, Kasia is also responsible for product management at Mavention. Two fields that, in the conventional world, are far apart. Kasia, like her colleagues at Mavention, sees it differently: ‘I talk to our customers and contacts every day about their challenges, their agenda and what keeps them busy. By listening carefully I can make excellent connections and feed our product team with them.’ We let Kasia do the talking.

20% custom work

I want to show the world that we look beyond just our projects. Consultancy, or what we sometimes call ‘the hands’, never answers 100% of the question the customer asks. With the standard Microsoft Office 365 platform, a customer can manage 80% very well. But there is always a gap of about 20% that requires custom work. At Mavention we make sure the lead time and costs for that custom work stay as low as possible.

Listening carefully and picking up signals

But custom work can also be reused. That is why our product team was created. By listening carefully I try to discover how we can shape that custom work so that it has broader support in the Dutch market, so we can turn it into an out-of-the-box product. Several of our products originated by listening carefully and picking up signals from the market.

Responding to frustration and broadening wishes

For example, Mavention Workspace gives a clear overview of all your SharePoint sites, Office 365 Groups and Teams. I heard the same frustration from several customers: When do we use which app? Where do I find my information? Who is in which group? My users don’t get it anymore! We responded to that frustration with Mavention Workspace. Deloitte once asked us to build a narrowcasting system. That is not available in Microsoft Office 365, and they wanted to use it to show up-to-date information on digital screens in the building, without manually copying information out of Office 365. Mavention Screen came out of that wish.

In co-creation with our customers

In that way our product team is constantly inspired by our customers’ wishes. But we also like to work in co-creation with those customers, and not just by listening. Every year we organize a meeting where our customers can weigh in on the roadmap of Mavention Make, and together we look for functionality that must not be missing in the next version of Make. This way we want to make our products even better, together with our customers.

More than just hands

With that constant contact we want to make the difference. We are more than just hands. If you have a problem you cannot solve with standard Microsoft Office 365, you can trust us to help you. Sometimes with one of our out-of-the-box products, sometimes with custom work after all, but always at minimal cost. That is what I mean when I say we stand for our customers. Their satisfaction is our most important KPI, and I am quite proud of that!

Want to know more?

Would you like to discuss what we can do for your organization? Feel free to contact me.

Signature Kasia


Top 10 Flow Best Practices That Will Improve Your Flow Game – Part 2: Limitations and how to work around them


Did I mention Flow is awesome? You already knew that if you read Part 1 of this Blog.

In Part 1 I gave you a couple of easy tips, but now it’s time to delve a bit deeper. The next tips are all about knowing the limitations of Flow and how to work with/around them.

We’ll start off with an easy, but really handy tip.

6.   Zoom In/Out

It’s easy to lose yourself in a complex Flow. You can only see so much on the screen at the same time. Things can get especially clunky when you have conditional switches or parallel flows.

Flow doesn’t have the option to zoom in/out, but the browser does. Just hit Ctrl and + or Ctrl and - to zoom in and out. Your Flow will make a lot more sense when you can actually see it.

Zoom in/out

 

7.   Use Flow for bulk edits (and get around the 5000 item limit)

Flow is a fantastic tool for bulk edits. Say you have a list with 35000 items and you wish to change the email address of a specific employee. You might do this in the SharePoint interface, but that’s the textbook definition of a boring, tedious task.

You could use 1 single Flow run to do this for you. Just use ‘Get items’ to query the items you wish to change and perform an ‘Apply to each’ on them.

“But wait!” you say, “the ‘Apply to each’ has an item limit of 5000. So what happens when your library has more items?”

Excellent point. The ‘Apply to each’ only targets the items from the filter query. So as long as the result set of your ‘Get items’ query is below 5000 items, there shouldn’t be any problem.

I’ve written a blogpost about this in which I get into more detail about this subject.
https://www.mavention.nl/techblog/use-flow-for-bulk-edits-even-on-lists-with-more-than-5000-items/

8.   Use the Recurrence action to get around the 30-day Flow Limit

Let’s say we have a flow that needs to do something after a specific amount of time, or after a specific date. We have the ‘delay until’ action, but know this: a Flow will only run for a maximum of 30 days. So, for example, if you have a document library with a ‘published date‘ column and you need to check or approve each document every year, the delay action ain’t gonna cut it.

You could however use the recurrence action. This will allow your flow to run on a set interval (every day, every week etc.). You can put a filter or a condition in the flow that will return every document past a specific date.

I did a complete article about this subject. You can find it here: https://www.mavention.nl/blogs-cat/is-flow-a-working-alternative-for-sharepoint-retention-policies/

As a side note: Microsoft is working on a new action that triggers on a date field. This is another fantastic way of achieving this result, but at the time of writing (January 2019) it has not been implemented yet.

9.   Change the trigger in code so it doesn’t cost you precious Flows

NewsFlash… Flows aren’t free. Sure, Microsoft gives you a few flow runs to play with, but once you really get into it you’ll realize that you have to get an extra Flow license to keep those processes running properly.

Now I won’t get into the licensing and ways to get more Flow, but what I can tell you is how to build better Flows that only trigger when a certain condition is met.

Full disclosure, this one isn’t mine, but it’s too valuable not to mention. You can find the blogpost here: https://sergeluca.wordpress.com/2018/04/11/trigger-your-flow-only-when-a-document-is-created-from-a-specific-content-type/#comment-12692

In a nutshell, what this blog explains is how you can:

  • Export a flow
  • Change the flow trigger to only run on a certain content type
  • Import the flow

This is great if you have a flow that triggers on item creation and only needs to do something for a specific content type. This will eventually save you thousands of Flow runs.
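In a nutshell, the exported definition gains a trigger condition. A sketch of what the edited trigger could look like (the trigger name, property path and the ‘Invoice’ content type are all illustrative; check the linked post and your own trigger’s output for the exact shape):

```json
{
  "triggers": {
    "When_a_file_is_created": {
      "type": "ApiConnection",
      "conditions": [
        {
          "expression": "@equals(triggerBody()?['{ContentType}']?['Name'], 'Invoice')"
        }
      ]
    }
  }
}
```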

10.   Document your Flows

So, you made a Flow. It works perfectly and all is good. But will you still know how the flow works in 2 years’ time? Better yet, will anybody? Yes, you can open the Flow and figure out what it does. Flow is certainly more ‘readable’ than a SharePoint Designer Workflow. But it doesn’t say WHY it does these things. Also, you need a good understanding of Flow to figure out what a specific Flow does.

Yes, you’ll need to (gasp) document your work.

In my experience you do not have to write a 20-page document describing in great detail everything that is going on. I have written those documents and they have been read exactly zero times (I have statistics).

But everybody loves a good Flow Chart:

In a Flowchart you can give context to the Flow in a single screen. It is also a great tool to check your solution with the business, because everyone can understand a Flowchart.

You can make excellent flowcharts in Visio (and even import/export directly between Visio and Flow), but purely for documentation my preferred tool is https://Draw.io. It’s a free, easy to use online tool which also imports/exports Visio.

And there you have it: 10 Flow Best Practices That Will Change Your Life. I could have easily made it a top 20, so let’s make this a series. Until then… happy Flows for everyone!

 



New release: Mavention Make 5.0



Mavention Make 5.0

The long-awaited new version of Mavention Make is finally here. In version 5.0 we introduce new capabilities that support companies even better in the provisioning and governance of all Office 365 tools. Below you will find the highlights of Mavention Make 5.0.

Provisioning of Microsoft Teams

The ability to roll out Teams using templates has been one of the most important wishes from the user group over the past months. Make 5.0 brings us this functionality. Our customers can now create Team templates with, for example, branding, channels, document libraries and other tools that users need to carry out their tasks. And here too you can apply and enforce your governance policy, as you have come to expect from Make.

Check, validate and correct

Our users have indicated that they would like an easy way to compare existing Office 365 Groups, Teams and SharePoint sites against existing templates. We were able to match this wish with our vision for the development of Mavention Make. In Make 5.0 all activity around Office 365 Groups, Teams and SharePoint sites is tracked, so you build up a history in your environment. This lets you validate your environment against earlier templates and identify and fix any differences.

Updated to the latest version of PnP and CSOM

As our customers have come to expect, Make 5.0 has been updated to the latest version of the PnP and CSOM libraries.

In short, Mavention Make 5.0 offers important improvements and new functionality that let you work with Office 365 even more efficiently, securely and effectively!

The post New release: Mavention Make 5.0 appeared first on Mavention.

Use paging to browse through the SharePoint recycle bin


When you remove an item in SharePoint it is not immediately gone; it is safely stored in your recycle bin. With the current boundaries in SharePoint it is kept there for up to 93 days, depending on several factors. Restoring an item is as simple as browsing to the recycle bin, selecting the item and clicking the restore button.

When you have removed many items, you can use the infinite scroll in the modern UI or paging in the classic UI to locate an item and restore it manually. This is no problem when you have a couple of hundred files in there; it requires some time, but it is possible.

Now imagine that one of the users in your tenant has synced several libraries to their local machine with the OneDrive client. One day they decide it is time to clean up because there is not a lot of disk space left, and instead of stopping the synchronization before removing the files, they just remove them. Suddenly the recycle bin, which already held a lot of items, contains a couple of thousand more files. Even to the point that SharePoint no longer tells you how many items are in there and displays 0 as the item count.

Opening the recycle bin will show a lot of items. Now it is your task to restore them, but only the items that were removed by that specific user within a specific time range.

First run

As there are a lot of items to restore, you can use PowerShell scripts to help you out. A first attempt could be to use CSOM in PowerShell and try to read the site collection recycle bin:

$recycleBin = $clientcontext.Site.RecycleBin
$clientcontext.Load($recycleBin) 
$clientcontext.ExecuteQuery()

Again, this will not give you any problems loading the items if you stay within the boundaries and limits of SharePoint. Exceeding them will result in an error when calling the ExecuteQuery method to load $recycleBin.

Page through the recycle bin

Searching for a solution will probably bring you to the GetRecycleBinItems and GetRecycleBinItemsByQueryInfo methods on the Web object. The first method takes a couple of parameters, while the second one takes an object with the same parameters plus some extra options. Most properties are straightforward, and checking out the related enumerations gives a clear indication of how to use them.

Using the PagingInfo property

There is one property, "PagingInfo", that takes a string value without any explanation of how to format it. I was not able to find any documentation on the value required to successfully execute the method. This required some digging into how the recycle bin page in SharePoint handles these requests.

Starting out in the classic experience (so the recycle bin provides paging controls) with the developer toolbar open, it turns out a JavaScript function named NextPage is called. This function submits a form, document.forms["usrpage"], from the page. This form seems to hold the answer for the PagingInfo property.

The usrpage form has a hidden input "nextItemCollectionPosition" with a value that is built from an id, a title and a searchValue. These values are added to the input url-encoded.

id=<last-item-id>&title=<last-item-title>&searchValue=<last-item-deletion-datetime>

The id and title come from the last item visible on the current page; the searchValue contains the deletion datetime of that last item.
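As an illustration, such a paging string can be assembled in PowerShell like this (the id and title below are made-up example values; Add-Type loads the System.Web assembly that provides UrlEncode):

```powershell
# Build the PagingInfo value for the next page request.
# The id and title below are hypothetical examples; in practice they
# come from the last item of the previous page.
Add-Type -AssemblyName System.Web

$lastId    = "32a45c0b-0000-4c9e-9f2a-1234567890ab"
$lastTitle = "Quarterly Report.docx"

$pagingInfo = "id=$lastId&title=$([System.Web.HttpUtility]::UrlEncode($lastTitle))"
# $pagingInfo is now "id=32a45c0b-0000-4c9e-9f2a-1234567890ab&title=Quarterly+Report.docx"
```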

Modify the PowerShell script

Using the id and title for the PagingInfo property seems to do the trick to get past the first page, as the first page accepts an empty value.

Fetching the first batch of items looks like this:

$recycleBinQuery = New-Object Microsoft.SharePoint.Client.RecycleBinQueryInformation
$recycleBinQuery.ShowOnlyMyItems = $false
$recycleBinQuery.IsAscending = $false

$recycledItems = $clientContext.Web.GetRecycleBinItemsByQueryInfo($recycleBinQuery)
$clientContext.Load($recycledItems)
$clientContext.ExecuteQuery()

The results of this call give you the id and title values for your next call. To get the results of the second page you can use the following code:

#Take the last item of the previous result set ($recycledItems from the first call)
Add-Type -AssemblyName System.Web
$recycledItem = $recycledItems | Select-Object -Last 1
$nextId = $recycledItem.Id
$nextTitle = [System.Web.HttpUtility]::UrlEncode($recycledItem.Title)

#Code to fetch page 2
$recycleBinQuery = New-Object Microsoft.SharePoint.Client.RecycleBinQueryInformation
$recycleBinQuery.ShowOnlyMyItems = $false
$recycleBinQuery.IsAscending = $false
$recycleBinQuery.PagingInfo = "id=$($nextId)&title=$($nextTitle)"

$recycledItems = $clientContext.Web.GetRecycleBinItemsByQueryInfo($recycleBinQuery)
$clientContext.Load($recycledItems)
$clientContext.ExecuteQuery()
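Combining both snippets, a loop that walks through every page could be sketched as follows. This is a sketch, not a tested implementation: it assumes $clientContext is an already authenticated ClientContext, and the RowLimit property and the empty initial PagingInfo are assumptions based on the behaviour described above.

```powershell
# Hypothetical sketch: page through the entire recycle bin and collect all items.
Add-Type -AssemblyName System.Web

$allItems   = @()
$pagingInfo = ""        # the first page accepts an empty value

do {
    $query = New-Object Microsoft.SharePoint.Client.RecycleBinQueryInformation
    $query.ShowOnlyMyItems = $false
    $query.IsAscending     = $false
    $query.RowLimit        = 200
    $query.PagingInfo      = $pagingInfo

    $page = $clientContext.Web.GetRecycleBinItemsByQueryInfo($query)
    $clientContext.Load($page)
    $clientContext.ExecuteQuery()

    $allItems += $page

    # Build the PagingInfo for the next page from the last item of this one
    $last = $page | Select-Object -Last 1
    if ($null -ne $last) {
        $pagingInfo = "id=$($last.Id)&title=$([System.Web.HttpUtility]::UrlEncode($last.Title))"
    }
} while ($page.Count -eq $query.RowLimit)
```

The loop stops as soon as a page comes back with fewer items than the row limit, which indicates the last page was reached.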

Restore items

While GetRecycleBinItemsByQueryInfo, or GetRecycleBinItems, gives you the possibility to page through the recycle bin, it introduces restore issues. Items that you deleted yourself can be restored from the objects in the returned collection, but when you try to restore an item that was removed by someone else, it will present you with an error.

The collection of items from the recycle bin does contain the item id. The initial thought is then to use the (Site/Web).RecycleBin.GetById method to retrieve that specific item and restore it. This works perfectly if you do not exceed the view thresholds of the recycle bin; if you do, an error is thrown telling you that you exceeded the thresholds.

As you have a client context, you can request authentication cookies so you can call REST endpoints. However, calling the "_api/site/RecycleBin('$($itemGuid)')/Restore" endpoint gives an internal server error when you exceed the threshold. When you stay within the thresholds it will succeed and restore the item.
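For reference, such a REST call could be sketched as below. This is a hypothetical sketch: the site URL and item guid are made-up examples, and it assumes $clientContext.Credentials is a SharePointOnlineCredentials object (which exposes GetAuthenticationCookie). It will still hit the same internal server error once the thresholds are exceeded.

```powershell
# Hypothetical sketch: restore a single recycle bin item through REST.
# The URL and guid below are example values.
$siteUrl  = "https://tenant.sharepoint.com/sites/demo"
$itemGuid = "32a45c0b-0000-4c9e-9f2a-1234567890ab"

# Reuse the CSOM credentials to obtain an authentication cookie
$cookie = $clientContext.Credentials.GetAuthenticationCookie($siteUrl)

# POST requests need a request digest, obtained from the contextinfo endpoint
$digest = (Invoke-RestMethod -Method Post -Uri "$siteUrl/_api/contextinfo" -Headers @{
    "Cookie" = $cookie
    "Accept" = "application/json;odata=nometadata"
}).FormDigestValue

# Call the restore endpoint for the specific item
Invoke-RestMethod -Method Post -Uri "$siteUrl/_api/site/RecycleBin('$itemGuid')/Restore" -Headers @{
    "Cookie"          = $cookie
    "Accept"          = "application/json;odata=nometadata"
    "X-RequestDigest" = $digest
}
```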

Conclusion

Using paging in the recycle bin to restore items when the thresholds are exceeded works fine if you only need to restore the items you removed yourself. The challenge is a lot harder when you need to restore items removed by another person.

The post Use paging to browse through the SharePoint recycle bin appeared first on Mavention.

Diwug Teams Development


I really like developing stuff for Teams, so it made sense for Rick van Rousselt and me to present an introduction to Teams development at DIWUG. During an ordinary DIWUG night at Sharevalue, we presented two sessions.

Teams 101

As with all sessions, we started with an introduction: what a Teams app consists of, how packages are constructed and what classifies as a Teams app. In that introduction, we demoed the App Studio. Since authentication is hard, we also focused extensively on how authentication works in different scenarios.

Adaptive cards and Bots

As most interaction in Teams takes place in a chat window, using Adaptive Cards makes your life easier. Adaptive Cards are easy to understand as they consist of nothing but JSON; however, they differ slightly from the ones you can use in Outlook. As we had already demoed the App Studio, we showed the card options available from that interface, which helps you quickly draft a card. As Teams allows for notifications and chatbots, we also had some demos of those. Since anything that is not a tab in Teams acts like a bot, there are some great samples.

Teams management

We ended the session with some of the options you have for managing teams in your environment. Using PowerShell, the O365 CLI or the Graph is a great way to maintain sanity.
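For instance, with the MicrosoftTeams PowerShell module (assuming it has been installed with Install-Module MicrosoftTeams and you have the required permissions), a quick inventory of all teams in the tenant is just a couple of lines:

```powershell
# List all teams in the tenant with the MicrosoftTeams module
Connect-MicrosoftTeams
Get-Team | Select-Object DisplayName, GroupId, Visibility
```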

TLDR

Check out Developer opportunities at https://aka.ms/TeamsDeveloper & https://aka.ms/TeamsDevSupport. Then download the Teams Generator at https://github.com/OfficeDev/generator-teams. Play around with the samples provided at https://github.com/OfficeDev/microsoft-teams-sample-complete-node. And last but not least don’t forget to provide feedback: https://microsoftteams.uservoice.com/

Slides

You can view the slides on Slideshare.

Originally posted at: https://www.cloudappie.nl/diwug-teams-development

The post Diwug Teams Development appeared first on Mavention.

Flow: log approvals to get a grip on your approval process


The new approval center is a great out-of-the-box solution for all your approval needs. There is no need to make custom lists, because approvals from all the different business apps are collected in one place. Just add an approval action to your flow and you’re all set. You can even re-assign approvals straight from the interface.

Approval Center

BUT!

Such convenience comes with a drawback: you can only see the approvals you send and receive. That is fine for everyday use, but not ideal if you want to see whether all approvals are going as planned. As management you want to see if there are approvals that are not being… approved. If that is the case, you can act on this information.

Although approval statistics aren’t part of the approval center (yet), with a little bit of help from Flow you can make this yourself.

Log your approvals in SharePoint

Create a list in SharePoint. Typically, it will have the following columns:

  • Approval request
    Text of the request
  • Name of requester
    The name of the person that started the request
  • Item Link
    The link to the item or document that needs to be approved
  • Name of approver
    The name of the person(s) that approves
  • Approval outcome
    This will be either approved or rejected
  • Approval Comment
    The description why an item was approved / rejected

Now you can use Flow to create an item just before the approval starts. After the approval is done, you can fill out the approval outcome on the list item:

Log Approval

This way you can track all of the approvals and act on them.

The post Flow: log approvals to get a grip on your approval process appeared first on Mavention.

Study guide MS-300: Deploying Microsoft 365 Teamwork

$
0
0

I just found the new Microsoft 365 beta exams. Last year I didn’t do many exams, so it was time to catch up on some missing ones. In the next few weeks I will try to do the Azure exams, but as the M365 exams are still in beta I figured I could start with those.

I couldn’t find any study guides, so I quickly gathered most of the content. Based on the skills measured for MS-300 on the exam site, here is a short write-up:

Configure and Manage SharePoint Online (35-40%)

Configure and Manage OneDrive for Business (25-30%)

Configure and Manage Teams (20-25%)

Configure and Manage Workload Integrations (15-20%)

  •  Integrate M365 workloads with external data and systems
    • No link

I will attempt the exam next week, so if you have any tips let me know!

Originally posted at: https://www.cloudappie.nl/study-guide-ms-300-deploying-microsoft-365-teamwork/

The post Study guide MS-300: Deploying Microsoft 365 Teamwork appeared first on Mavention.
