One of the most time-consuming parts of building maps is finding the data sets you want. I often forget where I found different datasets, so I wanted to share some of the ones I keep going back to.
Big thanks to motionarray.com for explaining this:
1. If your footage is already the correct length, go to the Project panel, right-click the element, and select Interpret Footage > Main. In the dialog box, just set how many times you want it to loop.
2. Time remapping. Enable time remapping on the layer, Option+click the Time Remap stopwatch, then from the expression language menu select Property > loopOutDuration.
If you need only a portion of your video looped, create a precomp of your footage, then use option 2 on the precomp. Just make sure the precomp is the right length so the loop matches up correctly.
The loopOutDuration expression doesn't visibly indicate where each loop begins or ends, but you want a keyframe on the first frame of your loop; the other keyframes can be ignored.
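For reference, this is the expression that method 2 applies, written out with its default arguments (a duration of 0 loops the layer's entire duration):

```
loopOutDuration(type = "cycle", duration = 0)
```

Changing "cycle" to "pingpong" makes the footage play forward then backward instead of jumping back to the start.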
Find the .htaccess file in the folder where your index.html file is.
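A common set of rules for this, assuming an Apache server with mod_rewrite enabled (a sketch — check your host's existing rules before pasting), looks like:

```apache
RewriteEngine On
# If the request is not HTTPS, or the host is missing "www"...
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
# ...capture the bare domain and 301-redirect to https://www.<domain>
RewriteCond %{HTTP_HOST} ^(?:www\.)?(.*)$ [NC]
RewriteRule ^ https://www.%1%{REQUEST_URI} [R=301,L]
```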
This will force anyone who visits your site to the secure HTTPS version of the site, with the "www" subdomain.
By default, many websites serve duplicate content at both the bare domain (e.g. benbrenner.com) and the www subdomain (www.benbrenner.com).
In the last few years, Google has begun penalizing sites that allow this: as it crawls your site, it sees duplicate pages. To rank high in Google, your content needs to be unique from the rest of the internet.
Add this to your <head> tag (it won't work in <body>):

<meta property="og:image" content="http://url.to/preview/imageLocation.jpg" />
<meta property="og:description" content="Description of Content" />
<meta property="og:title" content="Social Media Title" />
<meta name="twitter:site" content="@twitterhandle">
<meta name="twitter:domain" content="http://www.domain.com/"/>
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:image" content="http://url.to/preview/imageLocation.jpg">
<meta name="twitter:title" content="Title for Twitter">
I live and die by data, and my favorite way to convey data is through Google Sheets. I recently wanted to set up a way to track all outreach efforts by the non-profit I work for. Instead of manually adding Facebook Insights numbers for every post, I figured there must be a way to automatically pull those reach numbers from Facebook into my spreadsheet.
As I began my research, the first several articles were clearly product placement for Supermetrics, which wants to charge you $100 a month to pull the data in for you. That's quite a steep price for a non-profit, so I kept looking.
It turns out Facebook has quite an extensive API for this, and Ana at MixedAnalytics lays out step by step how to set up an API token that doesn't expire, along with examples of the post data you can pull with it. Here's the link to the article, which I highly recommend if you, like me, want Facebook analytics flowing directly into your Google Sheets data.
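As a rough sketch of what such a call looks like, here's how you'd build the Graph API request for a single post's reach. The post ID, token, and API version below are placeholders (not from the article); "post_impressions_unique" is the Graph API metric name for post reach.

```python
# Sketch: build the Facebook Graph API URL for one post's reach metric.
# The API version here is an assumption; use whichever version is current.
from urllib.parse import urlencode

GRAPH_ROOT = "https://graph.facebook.com/v19.0"

def post_insights_url(post_id, access_token, metric="post_impressions_unique"):
    """Return the Graph API insights URL for a single post's reach metric."""
    query = urlencode({"metric": metric, "access_token": access_token})
    return f"{GRAPH_ROOT}/{post_id}/insights?{query}"

# Fetching and extracting the number would then look like:
#   data = requests.get(post_insights_url(POST_ID, TOKEN)).json()
#   reach = data["data"][0]["values"][0]["value"]
# From there you can write `reach` into your sheet, e.g. with Google Apps
# Script's UrlFetchApp or the gspread library from Python.
```

The same URL works from Google Apps Script, so the whole pipeline can live inside the spreadsheet itself.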