I used the NY Times Events Listings API to find events near me (based on browser geolocation data) and plotted them on a map using the Google Maps API.
Check it out here. You can see the code on Github.
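The gist of the code (a simplified sketch, not the exact code on Github; the events endpoint, parameters, and response field names here are from memory and worth double-checking):

// Get the user's location, query the NYT Event Listings API near it,
// and drop a Google Maps marker for each returned event.
navigator.geolocation.getCurrentPosition(function (position) {
  var lat = position.coords.latitude;
  var lng = position.coords.longitude;
  var url = "http://api.nytimes.com/svc/events/v2/listings.json" +
            "?ll=" + lat + "," + lng + "&radius=1000&api-key=YOUR_KEY";
  $.getJSON(url, function (data) {
    var map = new google.maps.Map(document.getElementById("map"), {
      center: new google.maps.LatLng(lat, lng),
      zoom: 14
    });
    data.results.forEach(function (event) {
      // field names are illustrative and may differ from the real response
      new google.maps.Marker({
        position: new google.maps.LatLng(parseFloat(event.geocode_latitude),
                                         parseFloat(event.geocode_longitude)),
        map: map,
        title: event.event_name
      });
    });
  });
});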
For this week's assignment:
This week you’re going to design a word. Pick a word and make a typeface around it. The important thing is that you draw the letters using some kind of rule-based logic. Look at the examples I showed in class, and try to come up with your own typeface system - even if it’s very simple. Your goal is to convince me that you can make a typeface that is better constructed in code than in Illustrator.
This is what I created in Processing:
The code for this sketch is up on Github.
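The full system is in the repo, but to illustrate what "rule-based" means here, a stripped-down sketch in p5.js-style JavaScript (not my actual code) might render a letter from a coarse bitmap grid, drawing a circle for every filled cell:

// Minimal rule-based letterform: the letter is a 5x5 bitmap, and
// every "1" cell becomes a circle on a regular grid.
var H = [
  "10001",
  "10001",
  "11111",
  "10001",
  "10001"
];

function setup() {
  createCanvas(200, 200);
  background(255);
  noStroke();
  fill(0);
  var cell = 30;
  for (var row = 0; row < H.length; row++) {
    for (var col = 0; col < H[row].length; col++) {
      if (H[row].charAt(col) === "1") {
        ellipse(30 + col * cell, 30 + row * cell, cell * 0.8, cell * 0.8);
      }
    }
  }
}

Swap in a different bitmap and the same rules produce a different letter, which is the point of building it in code rather than Illustrator.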
For this week's assignment:
This week you’ll create an abstract representation of your identity. Pick one of the color schemes from today’s lecture and write a Processing sketch that outputs abstract forms in colors using your chosen scheme. The only constraint - besides that it needs to be as simple as possible - is that the colors have to be different every time you run your sketch. This means that you will need to use “random()”. This is an exercise in writing code that is generative, but within constraints. Every output should be different, but still effectively communicate who you are.
I made this:
The code is on Github!
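As a generic illustration of the constraint (fixed hues, everything else random), a p5.js-style version of the idea could look like this (not my actual sketch; the hues are stand-ins for a chosen scheme):

// Constrained randomness: hues come from a fixed palette, while
// random() varies position, size, saturation, and brightness,
// so every run is different but stays on-palette.
var hues = [10, 200, 35]; // stand-in hues for a chosen color scheme

function setup() {
  createCanvas(400, 400);
  colorMode(HSB, 360, 100, 100);
  background(0, 0, 100);
  noStroke();
  for (var i = 0; i < 40; i++) {
    var h = hues[floor(random(hues.length))];
    fill(h, random(40, 90), random(50, 100));
    ellipse(random(width), random(height), random(10, 80), random(10, 80));
  }
}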
Three short assignments this week as we continue to learn/review JavaScript. Additionally, we have to think about what our first mashup will be.
// Exercise 1: print a triangle of "#" characters, growing from
// one to six per row.
var hash = "#";
for (var i = 0; i < 6; i++) {
  console.log(hash);
  hash += "#";
}
// Exercise 2: FizzBuzz -- print the numbers 1 to 100, replacing
// multiples of 3 with "Fizz", multiples of 5 with "Buzz", and
// multiples of both with "FizzBuzz".
for (var i = 1; i < 101; i++) {
  if (i % 3 == 0 && i % 5 == 0) {
    console.log("FizzBuzz");
  } else if (i % 3 == 0) {
    console.log("Fizz");
  } else if (i % 5 == 0) {
    console.log("Buzz");
  } else {
    console.log(i);
  }
}
// Exercise 3: build an 8x8 chessboard pattern of "#" and spaces,
// alternating based on whether the row + column index is even.
var string = "";
for (var i = 0; i < 8; i++) {
  for (var j = 0; j < 8; j++) {
    if ((i + j) % 2 == 0) {
      string += "#";
    } else {
      string += " ";
    }
  }
  string += "\n";
}
console.log(string);
For our first assignment we have to create a single web page experience that, upon user input, responds with data from at least 2 web APIs.
Foursquare Heatmap
One idea I have is to create some sort of Foursquare heatmap. I like maps, and I like Foursquare. I could use the Foursquare API to retrieve data on venues that are trending or have high ratings, and then map their locations using the Google Maps API. I feel like this has probably been done before (and Foursquare's website somewhat functions like this), so I want to think about what other value I can provide, whether it's functional, entertaining, etc.
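In rough strokes, the flow would be something like this (a hypothetical sketch; the Foursquare parameters and response fields would need verifying):

// Fetch trending venues near a point and drop a map marker for each.
var params = "?ll=40.7291,-73.9937" +                 // location to search around
             "&client_id=YOUR_ID&client_secret=YOUR_SECRET" +
             "&v=20140301";                           // Foursquare API version date
$.getJSON("https://api.foursquare.com/v2/venues/trending" + params, function (data) {
  var map = new google.maps.Map(document.getElementById("map"), {
    center: new google.maps.LatLng(40.7291, -73.9937),
    zoom: 13
  });
  data.response.venues.forEach(function (venue) {
    new google.maps.Marker({
      position: new google.maps.LatLng(venue.location.lat, venue.location.lng),
      map: map,
      title: venue.name
    });
  });
});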
I Want to Go to...
A mashup that, once again, involves maps. Users would type in a city or country they want to visit, and it would pull up Instagram pictures geotagged in that location.
What Should I Eat?
A Foursquare + Foodspotting mashup. Maps restaurants in a given location, and populates the page with photos of suggested dishes to eat.
For this week's assignment...
Write a Processing sketch that outputs 2 shapes on a page. The first shape should be inspired by the word “wet”. The second shape should be inspired by the word “sharp”. Use only black and white. You have to use beginShape(), and all vertex points have to be created in a for loop. No manual plotting.
Here is what my code generated:
It was also the first time we printed at NYU's Advanced Media Studio (AMS), using professional printers and high-quality paper. The print looks really great (relative to a regular laser printer). You can find the code on Github.
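For reference, the core beginShape()/vertex()-in-a-loop pattern looks something like this (a simplified p5.js-style sketch, not the code in the repo):

// Generic beginShape()/vertex() pattern: a "sharp" spiky ring whose
// vertices are all computed in a for loop rather than plotted by hand.
function setup() {
  createCanvas(400, 400);
  background(255);
  fill(0);
  noStroke();
  beginShape();
  for (var i = 0; i < 60; i++) {
    var angle = TWO_PI * i / 60;
    // alternate between two radii to create sharp points
    var r = (i % 2 === 0) ? 150 : 80;
    vertex(200 + r * cos(angle), 200 + r * sin(angle));
  }
  endShape(CLOSE);
}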
For our very first assignment, we're creating a very simple web page that displays data we found through an open API. I chose the NY Times Most Popular API, which tracks the most emailed, shared, and viewed articles. You can check out my web page here (last updated on 2/6 at 3pm).
The URL that I used to retrieve this data (which returns JSON data) is:
http://api.nytimes.com/svc/mostpopular/v2/mostshared/all-sections/1/?api-key=[XXXXXXXX]
I haven't included my API key here for obvious reasons, but you would insert your own key in order to get the JSON data. The URL encodes the following elements: the API version (v2), the type of list (mostshared), the sections to draw from (all-sections), and the time period in days (1).
The returned JSON contains a results array of articles; each entry includes fields like the article's title, url, abstract, and published date.
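Requesting the data is simple with jQuery (a minimal sketch; it assumes a hypothetical #articles list on the page and a valid key):

// Fetch the most-shared articles and list their titles on the page.
var url = "http://api.nytimes.com/svc/mostpopular/v2/mostshared/" +
          "all-sections/1/?api-key=YOUR_KEY";
$.getJSON(url, function (data) {
  data.results.forEach(function (article) {
    $("#articles").append("<li><a href='" + article.url + "'>" +
                          article.title + "</a></li>");
  });
});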
For our first assignment:
On ITP’s laser printer, print a PDF file generated via a Processing sketch. A few rules about your code: You can only use black (0) and white (255). You are only allowed to use triangle(), rect() and ellipse() once each, and no other drawing functions are allowed (no beginShape or images). Bring to class a design of an ice cream cone. Yes, an ice cream cone.
This is what I designed:
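As a generic illustration of how tight the constraints are (a p5.js-style sketch, not my actual design): one triangle for the cone, one rect for a wrapper band, one ellipse for the scoop.

// One triangle (cone), one rect (wrapper band), one ellipse (scoop),
// in black and white only -- each drawing function used exactly once.
function setup() {
  createCanvas(400, 400);
  background(255);
  stroke(0);
  fill(0);
  triangle(150, 180, 250, 180, 200, 340);   // the cone
  fill(255);
  rect(170, 220, 60, 14);                   // wrapper band across the cone
  ellipse(200, 140, 120, 110);              // the scoop
}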
For my final project, I'm going to continue working on the game I created for the midterm. Based on the feedback I received in class and further thinking and testing of other games, I've decided I want to create more of a 'playground' than a game with specific goals.
What my classmates found most interesting were the character designs and the inspiration I took from the world of pixels and 8-bit sounds & visuals. Many were reminded of old video games they used to play decades ago.
With all of this in mind, I'm going to implement the Box2D physics library so I can add collision detection to all the characters and objects, as well as more interesting interactions like jumping. I want to keep the controls simple since that was key to making the midterm project an enjoyable experience. It was suggested that I add a feature that allows users to say predetermined "catchphrases" by, for example, hitting the keys 1-9.
Ultimately, I'd like the interface of the game to be mobile devices. I could wrap the JavaScript in PhoneGap, or perhaps simply tap into the sensor data that mobile browsers already expose (like the accelerometer and gyroscope). I want it to be a shared experience that takes place in physical proximity, rather than one dispersed across browsers in different locations around the world. Ideally, it would be played in front of one main screen, where players connect with their mobile browsers and play in the same physical and digital space.
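Reading those sensors doesn't even require PhoneGap; mobile browsers expose them through standard events, e.g.:

// Listen for orientation changes on a mobile device; alpha/beta/gamma
// are rotation angles in degrees around the three axes.
window.addEventListener("deviceorientation", function (event) {
  console.log("alpha: " + event.alpha +   // rotation around the z-axis
              " beta: " + event.beta +    // front-to-back tilt
              " gamma: " + event.gamma);  // left-to-right tilt
});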
I want to explore the physical manifestation of sound. Specifically, I'm interested in the field of cymatics: visualizing sound. There are many different ways of performing one's own cymatics experiment, with different materials and set-ups. I'm interested in using a driver mounted on a speaker, which is then attached to a metal plate. When sound frequencies are played through the speaker, the plate will vibrate, causing the material on the plate (any solids or liquids) to visualize the vibrations, or sound frequencies.
I started performing my own experiment using an old speaker I found at ITP's junk shelf and a plastic container with some water.
This is just a proof of concept. I would like to amplify the sounds to get stronger vibrations, use a metal plate because metal has better resonance than plastic, and use a solid material like sand in order to give the visualization of the sounds greater permanence.
This concerns the visual output of my project. For the input (what triggers the sounds/vibrations), I wanted to use my tennis racquet recordings, in order to visualize the frequencies of tennis racquets. The concept would involve two metal plates, each reacting to one of two tennis racquets playing out a match, visualizing the match through sound.
My concern is that the sound of a tennis racquet hitting a ball lasts only about 1 second, producing incredibly short-lived vibrations. The patterns in water or sand caused by the vibrations take place through sustained frequencies, so 1-second sound clips will not suffice to generate interesting visuals. I'm considering extending the length of the recordings in order to draw out the sounds I recorded, but then I'm afraid I'd be losing the sonic quality of the tennis racquet hitting a ball.
Another approach to this would be using gestures to generate different sound frequencies, which would then be visualized through cymatics. I think this could be a playful way to learn about sound physics, while generating interesting visual patterns. I could implement this using the Leap Motion and Processing to map hand gestures to different sine wave frequencies, and play these sounds through the speakers in order to visualize them.
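As a first prototype of that mapping, the same idea could even run in the browser rather than Processing (a rough sketch using the LeapJS library and the Web Audio API; the height-to-frequency range is a guess):

// Map the height of the hand above the Leap Motion to the pitch of a
// sine wave, played through the speakers driving the plate.
var audio = new (window.AudioContext || window.webkitAudioContext)();
var osc = audio.createOscillator();
osc.type = "sine";
osc.connect(audio.destination);
osc.start();

Leap.loop(function (frame) {
  if (frame.hands.length > 0) {
    var y = frame.hands[0].palmPosition[1]; // hand height in mm, roughly 0-600
    // map ~0-600mm of hand height onto ~100-1000 Hz
    osc.frequency.value = 100 + (y / 600) * 900;
  }
});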