- Caught up with all the Slack.
- Rearranged Google Calendar to reflect the new routine and the Agile Team Meetings.
- Requested admin permission for the Slack workspace for clean up.
- Volunteered to do the videos in HitFilm
- Shared the Serial Link trailers with Sarra
- Watched all the week 5 material on Canvas and added a couple of things to my agenda for the meeting later today. I think that we may want to complete a full XD prototype before we get going with the Unity project; up for discussion.
- Had an idea about music that fades in based on location, proximity to target, Threat Level and so on.
- Had an idea about reducing the progress toward the Clearance Level on Op failure. Would have to manage frustration though.
- A Random event could also be a bluff in that the Handler could just say “stop … ” and then say that the coast is clear, move on.
- Thought about the ways in which an Agent would be able to interact with the app:
- Headphone buttons
- Change of location
- Change of speed
- Stopping / waiting
- The new name is Agent.
- Narrative is that you are a Sleeper and you are activated by the Handler so that you can work for the Agency (although we will want to think about what that is and what it’s called). The narrative will be purposefully vague and contain just enough information so that the Ops make sense and give an idea of what you are doing. There is just not the time to develop a full narrative, and the structure that we are building will tell the guts of the story. The mechanism we will use for this is that when the narrative would usually supply something specific, we will have the Handler say that the information is classified or has been redacted for Clearance reasons.
- Installed Unity
- Installed MapBox in Unity
- Started a course on Pluralsight for onboarding with Unity
- Carrying on with the Unity course
- Lack of knowledge of Unity; solving with the course.
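While I get up to speed with Unity, the proximity-based music idea above can at least be prototyped in plain code. Here is a minimal, language-agnostic sketch in Python; the radii and the `threat_level` multiplier are hypothetical tuning values, not anything the team has agreed on.

```python
def fade_volume(distance_m, fade_start_m=200.0, fade_full_m=25.0, threat_level=1.0):
    """Map distance to the target (metres) to a music volume in [0, 1].

    Volume is 0 beyond the fade-in radius, 1.0 at or inside the
    full-volume radius, and interpolates linearly in between.
    A threat_level >= 1.0 widens the audible radius, so higher
    threats are heard from further away.
    """
    start = fade_start_m * threat_level
    if distance_m >= start:
        return 0.0
    if distance_m <= fade_full_m:
        return 1.0
    # Linear interpolation between the two radii.
    return (start - distance_m) / (start - fade_full_m)
```

The same curve could later be fed into whatever audio system we end up with; the point here is just that the mechanic reduces to a single distance-to-volume mapping.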
Done: Suggested to David that we produce the coding standard, including a source control strategy, as the next step. Worked on finalising the first iteration of the gameplay loop. Started to define ‘threat levels’ as part of the mechanics and fleshed them out a little in the doc. Came up with the idea of a Code Name Generator so that the Ops get some more interesting and unique names. Came up with the idea of Random Events within the Op to add more interest. Found an interesting video featuring an ex-CIA master of disguise talking about the craft and put that on the Inspire Channel.
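The Code Name Generator could be as simple as combining two word pools. A tiny Python sketch to show the shape of the idea; the word lists here are placeholders, not the real tone we would want for the Ops.

```python
import random

# Hypothetical word pools; the real lists would be curated for tone.
ADJECTIVES = ["CRIMSON", "SILENT", "HOLLOW", "NORTHERN", "VELVET"]
NOUNS = ["SPARROW", "LANTERN", "CIPHER", "HARBOUR", "ECHO"]

def code_name(rng=random):
    """Return an Op code name like 'OPERATION SILENT CIPHER'."""
    return f"OPERATION {rng.choice(ADJECTIVES)} {rng.choice(NOUNS)}"
```

Even two pools of fifty words each would give 2,500 combinations, which feels like plenty for the number of Ops we are likely to ship.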
Doing: Working on the Dead Drop gameplay loop as an example of one of the Ops and hope to have the first draft of the entire App loop today.
Done: Uploaded video to YouTube, put images on my Journal (took them down again as they were not mine!), had David’s link to the XD prototype ready to go. Attended the meeting and gave part of the presentation, which I think went well enough for the amount of time that we had to plan. Earlier in the day, I caught up with all the Slack channels and looked through some of the products that may be competition to us.
Doing: Finishing the gameplay loop to a first iteration standard so that it could be the target for the first working prototype, even if that has a ton of placeholders. Then, if there is time, I would like to look into Unity for onboarding.
Currently, we are engaged in a group project that is centered around the theme of location aware apps. There are other constraints too, but I don’t have the time to present a full run down. The purpose of this journal has shifted a little now; it’s not an academic, marked piece of work but more a place where I can record some whats, wheres and whys so that I can refer back to this in the future. Its content should be enough that it jogs my memory of the process and not a full account of everything, so apologies for that.
As part of week one’s content we were advised to read a piece called “Step and Play!: Space as interface in the context of location-based musical album apps”, found on the ACM Digital Library.
Using the space
The paper talks about the use of the environment in order to access content within a musical album that you own or are experiencing. So the idea is that you have to travel to different locations to play the different songs, or that the audio changes and updates based on the listener’s physical location. I like the sound of that (Ha!) but the hermit in me immediately thought, ‘yes, that’s fine, but I have paid for this and I will expect to be able to access it from right here on my throne!’. So, I must say that the purpose to which this material was applied in the paper didn’t really make me want to use musical albums that way, but it was food for thought.
Music in urban environments
Now this was interesting to me. It resonated with me because the author talks about the combination of the physical locations that are being traversed by the listener and the ‘invisible’ layer of music that sits alongside or over the top of that more visual and physical experience. He talks about the fact that personal music really was the first mobile AR system, and I think he’s right. A person’s reality, how they experience the physical space in which they find themselves, is changed by whatever they are listening to. I have personally experienced this and found it very interesting to have a vocabulary with which to talk about the effect that the use of personal audio can have. The paper talks about putting the listener in the director’s chair; they have agency over this physical sense of hearing, and all without any real consequence. There is nothing that the listener has really given up in order to do this. I know, I know, they need to be more careful crossing the road and all that, but, essentially, they are still in command. It’s very satisfying to have that choice and, in thinking about that, it’s made me think how much listening to music may be less to do with the music and at least a little to do with enjoying that agency. Hmm…
What it means to us
Our project (which will be discussed in other posts) will use this structure, and we have agreed that the narrative and player directions will be delivered via headphones as much as possible. Our idea centres around making the player feel like an Agent, and one of the key mechanics for that is that they will be able to interact with a fully voice acted Handler. I think that this will increase immersion and have the effect that the paper talks about. I hope that we will be able to augment the reality of the player using the headphones as much as any other technology we are planning to incorporate.
This is now the second app jam I have done and I am really noticing the benefit of having to come up with something interesting within the confines of the, er, constraints. This jam was based around the theme of the module, which is ‘Location Aware Apps’.
The initial theme was then further constrained with the words Police, Withdraw, and Difficulty. We were told that there would be an added twist in that there would also be a set of modifiers added to the mix. The plan was that you would have to incorporate and, yep, you guessed it, ‘modify’ the concept that had no doubt been bubbling in your brain. It served another purpose too. It stopped the Jam getting too technical too quickly and forced us to stay away from actual development and stay with the ideas a bit longer. I think that this was good for me as I tend to do two things, although I am working on both of them.
1: I jump straight in. When I have some idea that I think is awesome, I tend to want to run with it, and I do wonder if at least some of that is down to me wanting to get to some sort of execution before the sense of the idea being ‘awesome’ dissipates. I like the excitement of having an idea, developing it out and then seeing it on-screen.
2: I seem to be in good company with this one and it’s the game dev’s old enemy, over-scoping. I get very carried away once my brain starts ticking, and ideas spawn ideas and so on. I will say, although every over-scoping addict will say this, that I think I do have a good ability to stay focused and on track. I don’t come up with sets of mechanics that should really be in different games; it does all fit in with the idea that I am developing. There just tends to be a lot of it. I think that this is why I like Agile, Epics, User Stories and so on. It’s a really good way of being allowed to get carried away and then making sure that you just pick the one or two features that you are going to make first.
The modifiers for the Jam were (pulled from Canvas, I don’t have the time to type all this!):
- Happy Anniversary!
- Your app should incorporate one of GAM730 2018’s themes – either COMBINE, GHOST or EXPLORATIVE – in addition to this year’s three themes.
- Public information
- Use offline or live data from a public API in your app.
- LEGO Got It Right
- There are no spoken or written words in this app. This is even true in the instructions.
- Wrist Watcher
- The app is designed to run on a smart-watch, or uses wearable technology in some way.
- Rumbled
- Use haptics (vibration/rumble) to make your app more accessible to people who have some difficulty seeing or hearing.
- Where in the world is…
- App content changes depending on the user’s geographical location (GPS, IP location, etc). The user experience is therefore significantly different for people all around the globe.
So my idea was to produce a mobile game that uses Geo-Location and Augmented Reality to provide the player with a secret agent type experience. The player would interact with a Handler, who is fully voice acted, and be sent out into the real world on Operations that they would be able to complete by going to a location, performing some action and leaving the area quickly. The suitable modifiers (or Diversifiers, as I have just realised they are called) may be Wrist Watcher, Rumbled and Public Information. I have lots more information about this idea, and if I have more time I would like to talk about how I came up with it and perhaps present the entire concept.
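The ‘go to a location, act, and leave quickly’ loop hinges on one thing: knowing how far the player is from the Op site. That reduces to the standard haversine distance between two GPS coordinates. A minimal Python sketch; the 30 m trigger radius is a made-up value for illustration, not a design decision.

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres via the haversine formula."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def at_op_location(player, target, radius_m=30.0):
    """True when the player (lat, lon) is within the Op's trigger radius."""
    return distance_m(player[0], player[1], target[0], target[1]) <= radius_m
```

The ‘leave the area quickly’ half of the mechanic would just be the inverse check (distance growing past the radius) run against a timer.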
The course has resumed and that means just one thing. Fatigue. Actually, it means a few other things too. It means that I get to learn more about the things that interest me, I get challenged and I get to experience the feeling of accomplishment when the marks come in. Sometimes. Let’s get on then…
This module is all about creating an app or game in a collaborative team. This presents a host of opportunities and difficulties and time will tell how many of each we will face. I am generally optimistic and a good problem solver but I have had mixed experiences with academic team work in the past as have many of the cohort. I think, and hope, that at this level working together should be enjoyable and relatively friction free.
What I hope to give
If we get to the end of this project and I have supplied the team with the very best ideas I could, have worked hard to provide the best technical implementation I could and added value to team members where I could, then I will consider the project a success. I do believe that a person just sort of ‘knows’ that they did all that they could, and that’s what I will be aiming for regardless of grade. I also want to make sure that the people in the team feel that they can approach me with whatever they need to discuss. I think that it’s a person’s responsibility to make themselves approachable in this kind of context and to be mindful of tone, body language and other forms of non-verbal communication so that a trusting and open environment can be created. I will endeavour to give honest feedback in a constructive way and will not be rude or insincere in my behaviour toward the work, the team or myself.
What I hope to gain
That said, I don’t want a poor grade! That would not be desirable at all… But I think I have found that chasing the grade too closely is a recipe for frustration. So, I am focused more on what I believe the course is trying to steer me toward. I think that is the cyclical analysis of my weaknesses, the recognition of where I am falling down, the creation of some plan to solve that issue and the completion of some reflective practice to assess the success of that plan and the practice that it prescribed. On a practical level, I hope to learn about AR in Unreal and how to make and deploy Android mobile apps. I also hope to gain a better understanding and appreciation of what works and what does not with regard to working within a distributed team. There will be challenges there and I hope to learn from them.
How I think I will record what happens
I have had a little think about how I may record the progress that I make and that the team makes while creating the app. I think that the major areas to cover will be:
Me: Am I operating as I should? Am I completing the tasks and work that I said I would and to the standard that I would be happy with?
Team: Are we operating as a competent, supportive and professional team that is ready to support the needs of the members and also call out poor behaviours and attitudes that are detrimental to the project?
Work: Are we progressing with the work individually and on the whole? This is where supporting services like version control and communication tools may be reported on, I think.
Project: I have made this separate from ‘Work’. This may be a look at how the project is going forward overall. Is it going forward as we had planned? Have there been any problems, pivots and so on? Are we able to stick to the Sprint plan?
Let’s get to it…