









Work Collection


Blending aesthetics and technology
to create immersive experiences.




Project Type: Event
Category: Exhibition, Event

Client: Marvel Studios│Beast Kingdom
Year: 2018
Location: Singapore ArtScience Museum


#Marvel
#Avengers

Marvel 10th Anniversary Exhibition


When Marvel Studios released its first film, Iron Man, its cinematic universe opened. Over the past decade, many classic heroes have taken shape on the big screen: Captain America, Black Widow, the Hulk, Thor, Scarlet Witch, the Guardians of the Galaxy, and Ant-Man.

To celebrate the 10th anniversary of Marvel Studios, Ultra Combos was entrusted with helping create the Marvel Studios: Ten Years of Heroes exhibition, reproducing classic scenes from Marvel's vast universe. With the popular heroes standing in a line and an immersive spatial experience, the audience is guided through the first grand decade of Marvel Studios.





IRON MAN





CAPTAIN AMERICA




DOCTOR STRANGE





AVENGERS: INFINITY WAR





THEATRE









Marvel 10th Anniversary Exhibition


Curation: BEAST KINGDOM CO., LTD.
Producer: Herry Chang
Project Manager: Tim Chen
Creative Director: Tim Chen
Technical Director: Herry Chang
Programmers: Kyosuke Yuan, Hoba Yang, Nate Wu, Herry Chang, Yen-Peng Liao, Wei-Yu Chen
Art Director: Chris Lee
Concept & Storyboard: Chris Lee
UI/UX Design: Chris Lee
Visual Assistant: Jia Rong Tsai
Generative VFX (Dr. Strange / Ant-Man): Hoba Yang, Herry Chang
Motion Design (Captain America / Helicarrier / Bifrost / Theatre): MUZiXlll
Character Rigging (Theatre): Yoyo Chang

Video Installation Solution:甘樂整合設計有限公司
Hardware Lead:Herry Chang
Hardware Integration:Prolong Lai
Hardware Engineer:Wei-Yu Chen
Sound Design:The Flow Sound Design
Director of Photography:Ray.C

Photography Assistant: Ya-Ping Chang




Related Works:




Project Type: Ceremony 
Category: Event

Client: 3AQUA Entertainment  
Year: 2019
Location: Taipei Arena


#Laser
#rollingshutter
#audiovisualization

#webscraping
#opticalflow
#stagesimulator

G.E.M. - 30th Golden Melody Awards


In 2019, the 30th anniversary of the Golden Melody Awards, we were honored to be invited by 3AQUA Entertainment to take part in that year's performance production. They hoped Ultra Combos could bring technical elements into the show.

The theme of the segment was “Streaming”: Chinese superstar G.E.M. would interpret the ten most-viewed Chinese music videos on YouTube at the time. With streaming as the theme, “digital,” “technology,” and “the Internet” naturally became the main imagery. The musical styles of the ten songs are quite diverse, with love songs making up a large share, so weaving technology, as the core form and style, into the performance was a subject that had to be carefully thought through and delicately designed.

The Golden Melody Awards (GMA) is a flagship ceremony, very different in nature from a performance or a concert: there are many details and restrictions to be aware of, and the schedule is tight. Fortunately, through communication with Visual Director Guozuo Xu and close collaboration with 3AQUA, we received advice on the performance visuals while keeping room to play with technology.






Before the project began, we hoped the production process could be “what you see is what you get”


── building a tool that comes as close as possible to the live experience
The stage used large wavy LED frames as its theme, and their irregular form made it hard for production personnel to imagine how the stage would look from different locations. In addition, the resolution was equivalent to seven Full HD screens, so any modification cost the visual team a great deal of time. We therefore needed a way to cut down on the number of back-and-forth adjustments.

So we designed a simulator in which content produced in After Effects is transmitted straight into the Unity game engine; much like playing a first-person shooter, a visual designer can walk around the virtual GMA venue and see how the visuals read from any position.







Twilight

You Exist In My Song


── Composed of diffused light in the air
  • Laser Projector + Rolling Shutter

The opening needed a strong, digital-looking visual effect that matched the style of the song. After extensive research and discussions with 3AQUA, we decided to combine two techniques, a laser projector and the rolling shutter, to produce a screen of light surrounding the performer in the space.

This effect combines two principles: the way a laser projector produces graphics, and the rolling shutter.



To avoid misunderstanding: here, “laser projector” refers to the projectors used to make light shows at performances (projectors that merely use lasers as their light source are also called laser projectors). It works by bouncing a laser beam off mirrors rotating on different axes to achieve high-speed horizontal and vertical movement. So when we see the laser draw a horizontal line, we are actually seeing persistence of vision produced by a single point sweeping back and forth at high speed.

As for the rolling shutter: when a digital camera's sensor is a CMOS, one common shutter design controls the sensor with electronic signals, exposing it row by row, which creates a time difference between the top and bottom of the frame. This kind of electronic shutter produces a distorted image when filming fast-moving objects.

Combining these two properties with carefully designed camera and laser content produces an effect that can be seen only through digital transmission.
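As a rough mental model of why the pairing works, here is a minimal simulation (our own simplification for illustration, not the production setup): a rolling shutter exposes rows one after another, so a laser line swept downward in sync with the readout lights every row of the captured frame, even though the eye only ever sees a single moving line.

```python
import numpy as np

def rolling_shutter_frame(height, width, laser_y_at):
    """Simulate one camera frame captured with a rolling shutter.

    Rows are exposed one after another; row r is exposed at time-step r.
    laser_y_at(t) gives the row the horizontal laser line occupies at
    time t, so a row records the line only if the laser is on it at
    the instant that row happens to be exposed.
    """
    frame = np.zeros((height, width), dtype=np.uint8)
    for row in range(height):
        if laser_y_at(row) == row:   # laser sits on this row during its exposure
            frame[row, :] = 255
    return frame

# A laser line swept downward at exactly the readout rate lights every
# row: the camera records a full "screen" of light, although the naked
# eye only ever sees a single moving line.
synced = rolling_shutter_frame(8, 4, lambda t: t)

# A static laser line shows up as just one bright row.
static = rolling_shutter_frame(8, 4, lambda t: 3)
```

The real effect depends on tuning the laser scan rate against the camera's actual row-readout timing, which is why the pre-production testing described below mattered so much.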

In addition, without pre-production testing the final rolling-shutter effect is completely unpredictable. We are especially grateful for the planning and coordination of 3AQUA, the testing facility and equipment assistance of ERA Television, and the technical revisions and equipment assistance of laser consultant 陳逸明.





以後別做朋友


── Instant visual effect using optical flow calculations
  • Optical Flow

First, we'd like to thank 3AQUA for stepping in to produce the visuals for two songs, 演員 and 以後別做朋友, when we were in a bind, so that we could focus on the laser work and the rest of the songs.

For this song, the director planned for the monitor to show a live feed of G.E.M. and hoped to layer on effects with the same texture as the background visuals. After studying G.E.M.'s performances, we realized she is very expressive with her body, so imagery driven by motion detection would suit her well. However, the GMA stage hosts numerous performances and its mechanisms are complex, so installing any kind of sensor would have been a huge burden; we could only process the live camera feed directly. Optical flow seemed the best solution. Because we could not predict the clothing, background, or camera positions at the actual ceremony, we tested during the early stages on other performance footage from YouTube. Fortunately, the effect turned out well at the ceremony itself.
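As an illustration of the underlying idea (not our production code, which processed the live broadcast feed), a single-window Lucas–Kanade solve, one classic optical-flow formulation, can recover a global motion vector between two frames:

```python
import numpy as np

def lucas_kanade_flow(prev, curr):
    """Estimate a single global (dx, dy) motion vector between two
    frames using the Lucas-Kanade least-squares formulation over the
    whole image."""
    prev = prev.astype(float)
    curr = curr.astype(float)
    Iy, Ix = np.gradient(prev)       # spatial gradients (rows = y, cols = x)
    It = curr - prev                 # temporal difference
    # Normal equations of: minimize sum (Ix*dx + Iy*dy + It)^2
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    dx, dy = np.linalg.solve(A, b)
    return dx, dy

# A smooth blob shifted one pixel to the right between frames.
y, x = np.mgrid[0:64, 0:64]
frame0 = np.exp(-((x - 30) ** 2 + (y - 32) ** 2) / 50.0)
frame1 = np.exp(-((x - 31) ** 2 + (y - 32) ** 2) / 50.0)
dx, dy = lucas_kanade_flow(frame0, frame1)   # dx close to 1, dy close to 0
```

A dense per-pixel variant of the same least-squares solve, computed over small local windows, is what drives effects like the ones layered over the live feed.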

Using performance segments to test the optical flow effect
Credit: KKBOX






Love Confession


── Collect huge amounts of data from the Internet
  • Web Crawler

This song marks one of the more obvious shifts in musical style and, at the time of production, was also the song with the highest view count on YouTube. The director designed this segment to present the defining feature of these ten songs: each has over 100 million views. Besides the very direct, massive climb of a running counter, another important trait of streaming platforms is audience participation through comments, so in the visual design we hoped to layer in a barrage of comments.

Our engineers built a program that collects the comments and converts and saves them as images, which greatly shortened the visual team's production schedule.




The team took the 11,480 comments posted up to June 11th, 2019, automatically arranged and converted them, and saved them in an image format.
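The conversion tool itself is not public; as a hypothetical sketch of one small part of such a pipeline, the lane assignment that keeps a scrolling barrage of comments from overlapping, all names and pixel metrics here are invented for illustration:

```python
def layout_barrage(comments, n_lanes=5, px_per_char=18, gap=40):
    """Assign each comment to the barrage lane that frees up first,
    danmaku-style, returning (text, lane, start_x) placements so that
    comments in the same lane never overlap."""
    lane_free_at = [0] * n_lanes      # x position where each lane is next free
    placements = []
    for text in comments:
        lane = min(range(n_lanes), key=lambda i: lane_free_at[i])
        start_x = lane_free_at[lane]
        placements.append((text, lane, start_x))
        lane_free_at[lane] = start_x + len(text) * px_per_char + gap
    return placements

placements = layout_barrage(["nice!", "love this song", "gg"], n_lanes=2)
```

Each placement can then be rendered to an image strip and scrolled across the screen; the greedy "earliest free lane" rule keeps thousands of comments packed tightly without collisions.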




癡情玫瑰花


── Reassemble fonts
  • Variable Font

Among the songs performed, 癡情玫瑰花, which resonates strongly with local audiences, holds a special place: it confronts everyone with a brash, amorous force, and it was also the first time G.E.M. performed a song in Taiwanese at the GMA. Given these unique qualities, we decided to place the vulgar yet powerful lyrics across the stage background and to use the most complex, vivid color scheme of the entire performance, maximizing its vulgar, amorous character.

With conventional font technology, producing brightly colored text animation requires many work hours to break characters down into strokes and components. The JingXiHei VF font we used, developed by Arphic Technology, lets us freely adjust character width and weight, and its strokes come pre-separated, so we could quickly fine-tune individual strokes and assemble them into an ever-changing dynamic display, producing the approachable, brightly colored effects seen in the final performance.






Stranger In The North


── Accurate perspective and audio visualization
  • Accurate Perspective

This song followed 癡情玫瑰花 with a huge emotional shift, so the visuals needed a larger turn. We chose lasers that would cut across the venue, break the spatial barrier, and connect with the imagery, while also kicking off the “Inner World” that follows.

The concept of “Inner World” comes from the text from which we first drew the keyword streaming:

Understood and imagined carefully, streaming is a concept both beautiful and crazy. In ancient times, the fastest way to communicate an idea was probably by pigeon. Nowadays, ideas take a new form: no longer figures drawn by hand, but often undistorted image and sound. Ideas take a new stance. Through the flow of invisible information, they move forward at the speed of light, like a portal that infinitely compresses the distance between two places. Beside you and me, all we need is a device that can connect to the Internet, and this door opens easily to connect us. There is a new city, a new world, behind the door.

The primary task in establishing the world behind the door was making sure specific cameras could film the correct perspective and effect. The simulator came into play again: inside the simulated Taipei Arena we set up a camera in the same position as the actual broadcast camera. The visual designers only had to design images from this perspective, and the simulator automatically converted them onto the LED screens in the correct layout.
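At its core, this conversion relies on standard pinhole-camera perspective math. A minimal sketch (simplified to an axis-aligned camera with no rotation or lens distortion; the real simulator was built in Unity):

```python
import numpy as np

def project_points(points_world, cam_pos, f, cx, cy):
    """Pinhole projection: map 3D points to pixel coordinates for a
    camera at cam_pos looking straight down +Z (axis-aligned, for
    simplicity; a real rig also needs a rotation and lens model)."""
    p = np.asarray(points_world, dtype=float) - np.asarray(cam_pos, dtype=float)
    u = f * p[:, 0] / p[:, 2] + cx   # perspective divide, then shift
    v = f * p[:, 1] / p[:, 2] + cy   # to the principal point (cx, cy)
    return np.stack([u, v], axis=1)

# An LED point straight ahead of the camera lands at the image centre;
# an off-axis point lands proportionally off-centre.
leds = [(0.0, 0.0, 10.0), (1.0, 0.0, 10.0)]
pix = project_points(leds, cam_pos=(0, 0, 0), f=1000, cx=960, cy=540)
```

Running this mapping in reverse — from designed camera-space pixels back to the 3D positions of the LED panels — is what lets a perspective-correct image be "unwrapped" onto the irregular screen layout.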




  • Audio Visualization

In “Stranger In The North,” G.E.M. raps on her own. As a mentor on a recently famous hip-hop show, she inevitably needed this segment to make the audience go wild. We decided to bring in audio visualization to strengthen the link between sight and hearing, lifting the performance to a higher level while staying hooked into the year's theme, “I SEE MUSIC.” Because raw audio is sensitive to subtle, millisecond-level changes, we first decomposed the sound with low-pass, band-pass, and high-pass filters to make the connection between visuals and music more obvious.
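A minimal sketch of this kind of band decomposition, using ideal FFT masks rather than whichever filters were actually used in production:

```python
import numpy as np

def split_bands(signal, sample_rate, low_cut=200.0, high_cut=2000.0):
    """Decompose a signal into low / band / high components with ideal
    (brick-wall) FFT masks; the three parts sum back to the input."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    masks = {
        "low": freqs < low_cut,
        "band": (freqs >= low_cut) & (freqs < high_cut),
        "high": freqs >= high_cut,
    }
    return {name: np.fft.irfft(spectrum * mask, n=len(signal))
            for name, mask in masks.items()}

sr = 8000
t = np.arange(sr) / sr
# 100 Hz rumble + 1 kHz tone + 3 kHz hiss-like tone.
sig = (np.sin(2 * np.pi * 100 * t)
       + np.sin(2 * np.pi * 1000 * t)
       + np.sin(2 * np.pi * 3000 * t))
bands = split_bands(sig, sr)   # each component isolates one tone
</gr>```

The energy of each band can then drive a separate visual parameter — for example, kick drums in the low band triggering pulses while vocal sibilance in the high band drives fine detail.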

In addition, we planned from the start to feed in her live vocals and generate corresponding real-time effects, making her live voice even more vivid.




Video of Entire Performance







30th Golden Melody Awards



Ceremony Chief Producer: Isaac Chen
Ceremony Producers: Evan Wu, 莊佩禎, 彭佳玲
Performance Visual Coordinator: 3AQUA Entertainment
Visual Supervisor: Guozuo Xu

G.E.M. - STREAMING


Performer: G.E.M.
Visual Director: Guozuo Xu
Production Coordinator: Hsin Yi Kuo


Interactive Visual Designer & Technical Integrator: Ultra Combos
Producer: Nate Wu
Project Manager: William Liu
Creative Director: Jay Tseng
Art Director: Lynn Chiang
Visual Designers: Hauzhen Yen, Ting-An Ho, Kejyun Wu, Glenn Huang, Chianing Cao, Yohji Chen
Calculative Visual Artist: Kejyun Wu
Technical Artist: Hoba Yang
Technical Assistant: Wei-An Chen
Lighting Designers & Lighting Simulation: Dachai Chen, Xiaocheng Kuo
Laser Photography Instructor: 林寶財
Laser Technical Consultant: 陳逸明
Laser Photographer: 陳柏全
Filming Technical Consultants: 劉昇峰, 郭東洲, ERA Television



Related Works:




Project Type: Exhibition  
Category: Event, Exhibition

Client: Taichung City Government
Agency: Archicake Design
Year: 2018
Location: Taichung World Flora Exposition


#SiteSpecificCreation
#NDI
#GrayCodePatterns

Taichung World Flora Exposition
Phototropic Synesthesia


As the entry corridor to the exhibition, the team attempted to create an immersive “sensory rinsing” experience, deconstructing the audience's usual expectations of an exhibition center and completing a distinctive welcoming flow in every aspect, from the visual and auditory to the spatial and atmospheric.








Concept 



With “phototropic plants” as the main concept, imagery is projected onto plants, translating and amplifying the attitudes and traits of the four main characters: rice, fruits, mushrooms, and tea leaves. Monitors form a patchwork along the corridor, and through continually stacked, abundant images and an ever-tightening compression of space, a fantastical blend of the “organic” and the “geometric” emerges. This lets the audience feel a sense of openness in every respect on entering the indoor exhibition area.




Detail


Because of the demands of this project, the visual effects personnel had to create the projections onto the plants on site. Since the site is mostly outdoors, the work could only be carried out at night to avoid ambient light, and with the team also trying not to work too late into the night, time was very limited. In response to this challenge, we developed two tools.

  • Projector-Camera Calibration Using Gray Code Patterns

Simply speaking, this technology gives the visual effects personnel an image that can be viewed as “the picture seen from the projector.” The principle: Gray code patterns are projected by the projector onto the objects that need projection mapping, and a camera set up near the projector lens records how those patterns land on the objects. By computing the projector–camera relation, we can convert the camera picture into an image rendered from the projector's perspective.
There are already some ready-made tools for this; however, the images they produce are black-and-white, and their detail and accuracy are only barely satisfactory. To meet the visual team's quality requirements, the team developed this tool on its own.
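The heart of the technique is the Gray-code encoding itself: each projector column is lit with a unique on/off sequence across successive frames, and decoding the sequence seen at a camera pixel recovers which column lit it. A minimal sketch (encoding and decoding only; camera capture and thresholding are omitted):

```python
import numpy as np

def gray_code_patterns(width):
    """Stripe patterns: pattern k is bit-plane k of gray(column_index),
    so every projector column gets a unique on/off sequence and
    adjacent columns differ in exactly one pattern."""
    n_bits = max(1, (width - 1).bit_length())
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                            # binary -> Gray
    bits = (gray[None, :] >> np.arange(n_bits)[::-1, None]) & 1
    return bits.astype(np.uint8)                         # (n_bits, width)

def decode_columns(bits):
    """Invert the encoding: the Gray bits observed at a camera pixel
    identify which projector column lit it."""
    n_bits = bits.shape[0]
    weights = 1 << np.arange(n_bits)[::-1]
    gray = (bits.astype(np.int64) * weights[:, None]).sum(axis=0)
    binary = gray.copy()
    shift = 1
    while shift < n_bits:                                # Gray -> binary
        binary ^= binary >> shift
        shift *= 2
    return binary

patterns = gray_code_patterns(1024)   # 10 patterns cover 1024 columns
decoded = decode_columns(patterns)    # recovers 0..1023
```

Gray codes are preferred over plain binary here because adjacent columns differ in only one pattern, so a thresholding error at a stripe boundary shifts the decoded column by at most one pixel.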

  • Real-Time Visual Production Using NDI

After obtaining the image from the projector's perspective, the visual effects personnel still faced a problem: before projecting, there was no way to preview the visuals on the plants. They wanted a system for real-time projection editing that would spare them from repeatedly rendering and exporting video files. So we wrote plug-ins for software such as Photoshop and After Effects that push the visuals from the production software straight into the projection system. We used Network Device Interface (NDI) as the image-transmission interface. NDI is a technology for two-way audio-visual signal transmission over IP Ethernet networks, which also let our system achieve a wireless working environment over Wi-Fi.







2018 TWFE - Phototropic Synesthesia


Producer: Hauzhen Yen
Project Manager: Tim Chen
Creative Director: Jay Tseng
Art Director: Chris Lee
Technical Director: Nate Wu
Visual Designer: Chris Lee, Hauzhen Yen, Lynn Chiang, Glenn Huang, Alex Lu
Lighting Designer: Wei-Yu Chen


Sound Director: MUSDM
Sound Designer: MUSDM
Composer: MUSDM
Sound Mixing: MUSDM

Director of Photography: Ray.C, Book Ho
Assistant of Photography: Liang Fa Kan, Chen Kuan Chieh, Hsu Yuan fu
Production Manager: Jay Lee, Prolong Lai, Herry Chang, William Liu, Alex Lu
Assistant Production Manager: Peter Chen, Ya Ting Lin
Location Manager: Isabella Chang

Promote Video: Ting-An Ho
Promote Video Music Composer: IGLOOGHOST


Special Thanks: archicake design, Sean Yang, Sheng Yuan Hung, YUYUPAS, Tian Mi Xin Orchard


Related Works:




Project Type: Event 
Category: Exhibition, Event

Client: Universal Studios
Year: 2018
Location: Shenzhen Happy Coast OCT Exhibition Center


#Installation
#Museum
#Kids

A Minions Perspective World Premiere


The first Minions exhibition in the world is landing in Shenzhen: A Minions Perspective, with limited preview tickets on sale now. In cooperation with Universal and IE, Blooming Investment brought the exhibition to OCT Harbour in Shenzhen on December 8th, 2018. Welcome to the world of Despicable Me! Follow the footsteps and laughter of the Minions and experience every scene in the movie. It is a brand-new trip full of interactive fun in cinematic settings. Come and enjoy an exhilarating experience with the yellow elves! Visitors can tour Gru's famous laboratory and the girls' room; moreover, more than 500 official Minion products authorised by Universal are ready to go home with you. Interactive experiences across lavish themes, AR interactions, and other activities round out the exhibition, bringing warmth and positive energy to Shenzhen this winter and offering a must-go spot.










A Minions Perspective World Premiere


Interactive Visual Design & Technical Integration by Ultra Combos
Curation: BEAST KINGDOM CO., LTD.

Technical Director: Nate Wu
Programmer: Nate Wu, Hoba Yang, Wei-An Chen (@chwan1), Herry Chang, Jarvis Chung
Art Director: Chris Lee
Concept & Storyboard: Chris Lee, Lynn Chiang, Hauzhen Yen, Glenn Huang
UI/UX Design: Chris Lee, Lynn Chiang, Hauzhen Yen, Glenn Huang
Graphic Design / 2D Motion Design: Chris Lee, Lynn Chiang, Hauzhen Yen, Glenn Huang
VFX: Chris Lee, Hauzhen Yen
2D Motion Design (Pre-show Theater): Jim Hsu
2D Motion Design / Illustration (Girls' Room / AR App): Lichee He
Assets Modeling / 3D Animation / Lighting (Gru's Lab-Fart Gun): Chienche Wang

Video Installation Solution: 甘樂整合設計有限公司
Hardware Lead: Herry Chang
Hardware Integration: Prolong Lai
Sound Design: The Flow Sound Design
Director of Photography: Ray.C
Photography Assistant: Yaping Chang


Related Works:




Project Type: Concert
Category: Event

Client:  Blue Sky Production
Year: 2016
Location: Taipei Arena


#Particle System 
#Photogrammetry 
#Point Cloud 
#Fractal
#Audio Visualization
#LED Matrix

Tanya Concert LEMURIA


Singer Tanya Chua's 2016 World Tour LEMURIA revolves around the myth that in ancient times the earth was stunningly beautiful and replete with abundance; disease and violence were absent. When disasters later struck, the Lemurians entered the center of the earth to protect our world. Chua is one of these messengers, who uses music to soothe human beings and teach us how to love with her voice.

We were responsible for the visual designs of the three electro-style opening songs: "Strange Species" (異類的同類), "Aphasia" (失語者), and "Film" (菲林), as well as the ECODOT LED display in "Blank Space" (空白格).






The Songs as the Essence


In the beginning of the creation process, the team talked about how the melodies and lyrics of the three songs made us feel and how they inspired us visually. We brainstormed and came up with a few keywords and put these words into context for the storyline and the aesthetics settings.





Particle System


We developed a particle system to faithfully represent our aesthetic ideas. Point clouds were the key component of the visuals, simulating smoke, wind, and fractals to create a dynamic, psychedelic ambiance. Using the Kinect v2, we were able to retrieve point-cloud information from the performers in real time, simultaneously layering the performer's movements on top of the pre-made particle landscape.
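As a toy illustration of the coupling between a sensed point cloud and a particle field (a deliberate simplification; the production system was far richer), each particle can be steered toward its nearest cloud point every frame:

```python
import numpy as np

def step_particles(pos, vel, attractors, dt=1.0 / 60, pull=2.0, drag=0.98):
    """Advance a toy particle system one frame: each particle is pulled
    toward its nearest attractor point (standing in for a sensed point
    cloud), with simple velocity damping."""
    diff = attractors[None, :, :] - pos[:, None, :]   # particle -> attractor
    dist = np.linalg.norm(diff, axis=2)
    nearest = dist.argmin(axis=1)
    dir_to = diff[np.arange(len(pos)), nearest]
    dir_to /= np.linalg.norm(dir_to, axis=1, keepdims=True) + 1e-9
    vel = vel * drag + dir_to * pull * dt
    return pos + vel * dt, vel

rng = np.random.default_rng(0)
pos = rng.uniform(-1, 1, size=(500, 3))
vel = np.zeros((500, 3))
cloud = rng.uniform(-1, 1, size=(64, 3))   # stand-in for Kinect points
pos2, vel2 = step_particles(pos, vel, cloud)
```

Replacing the random cloud with the live Kinect depth points makes the particle field condense around the performer's silhouette, which is the basic mechanism behind layering motion onto a generative landscape.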






Photogrammetry


The storyline of "Strange Species" begins in a wasteland, and the traces of human existence are revealed one step at a time. We utilized aerial survey techniques to record landscapes in Taiwan, even going as far out as Penghu Island. We documented various forests, valleys, and abandoned buildings, and converted them into 3D landscapes.








Sound Visualization


Sound is the key ingredient in concerts. In order to better relate the images to the sound, we entered the data from the drums, vocals, and other instruments as three different sets of parameters to correspondingly alter the motion of the particles.


Fractals


Fractals are patterns that repeat identically at every scale; the idea is simple, yet the results are by nature intricate. To capture the nostalgic ambiance of "Film," we zoomed in and out to create a time-travel feel and used photos from Chua's daily life to form fractals, which were then interspersed through the 3D landscapes and the main visuals.




ECODOT LED Display


With "Blank Space" (空白格), we projected 3D images of the live performance onto an LED column 3 m × 3 m wide and 10 m tall.







Tanya Chua LEMURIA Tour 2016 in Taipei


Concert Promoter: TERRA BREEZE
Artist Management: CRYSTAL RESONANCE
Concert Produced by BLUE SKY PRODUCTION
Director: Chang, Wen-Ling
Artist: Tanya Chua

Interactive Visual Produced by ULTRA COMBOS
Producer: Jay Tseng
Visual Script & Design: Lee, Wei-Tsung; Huang, Wei-Jhe
Interactive System & Generative Visual: Wu, SzuWei; Chen, WeiAn; Yang, ChiaHao
3D Scenes: Yang, ChiaHao; Chen, WeiTing; Chuang, TingFeng; Wang, JianJie
System Planning: Liu, WeiShen
Executive Assistant: Huang, Weiju
Film Editing: Ting-An Ho

Assistant to Director: Lin, I-Lun
Stage Manager: Snake Huang, Jiro Li, Evan Wang
Production Manager: Yang, Jia-Ming
Stage Design: Noodle Lee, Sam Lai, Liao, Chen-Liang│PLAYFUL STUDIO
Graphic Design: Yang, Hsiu-Min│SILLY CREATIVE
Lighting Design: Wu Yuilng│LUMI LIGHT STUDIO
Laser Design: Mac Huang│DENG YEE LASER
F.O.H Mixing Engineer: Hsia Chieh
Monitor Engineer: Chen, Jun-Hao
Project Director: Maggie Wang


Related Works:




Project Type: Exhibition 
Category: Event, Exhibition

Client: Taichung City Government
Agency: Cogitoimage International Co., Ltd.
Year: 2018
Location: 2018 Taichung World Flora Exposition Discovery Pavilion


#Interactive Theater 
#Generative Art
#Particle System
#Scanning Laser Rangefinder
#Multi-screen stitching

Taichung World Flora Exposition Discovery Pavilion


The team was invited by Cogitoimage International Co., Ltd. to create a unit for the Discovery Pavilion in the forest area of the 2018 Taichung World Flora Exposition. The pavilion's design concept, "Seeing Half the Earth from Taichung," tells of an ecological journey along the Dajia River from sea level up to 3,886 m.






Design



0-3886M


Taiwan is an island straddling the subtropics and the tropics.
Taichung, one administrative district on this island, stretches from the Gaomei Wetlands at sea level up along the Dajia River to the summit of Xueshan, its highest point. An elevation difference of nearly four thousand meters gives this small place its ecological diversity.

"0-3886" became the narrative axis of the work: we wanted to build a theater that leads the audience, through the eyes of living creatures and at different scales and angles, to "discover" anew the ingenuity and beauty of creation.



The Dajia River


The pavilion divides 0-3886 into five elevation zones, echoing the Dajia River and threaded through by "water."

We therefore chose five characters bearing the water radical to represent the zones, 涵 (contain), 沛 (abundant), 涓 (trickle), 澗 (mountain stream), and 澄 (clear), translating the state of each character into the visual undertone. The imagery on the walls and floor switches along this axis.

To reinforce the feeling of steadily climbing elevation, the design uses several devices: the overall color temperature shifts from warm to cold, conveying the change in perceived temperature; the imagery runs from the seabed at the start to outer space at the end; and creatures of different elevations appear in relay.



涵 (Contain)|0 - 100M

Glimpse the rich vitality where sea and river meet





沛 (Abundant)|100 - 500M

Understand the relationship between water, plants, and people, and human development




涓 (Trickle)|500 - 1000M

Where forest animals roam; coexisting with wildlife




澗 (Stream)|1000 - 1500M

Plant species diverse and flourishing





澄 (Clear)|1500 - 3500M

The most rain-abundant part of the mountains





浩 (Vast)|The cosmos and the microscopic






Immersive Space


Beyond fitting the pavilion's curatorial context, "highly immersive with a lingering afterglow" was the design brief we set ourselves, and handling "perception" with delicacy became the central question.

Eyes, ears, and body are the three main channels of perception. For the eyes, film is undoubtedly a fine narrative form: our partner studio MoonShine Animation (夢想動畫), after absorbing the overall concept, produced a captivating film with a fantastical visual style and fluid pacing, projected onto a curved surface eighteen meters wide. For the ears, the renowned sound artist Lim Giong (林強) built the space's soundscape on a bed of natural field recordings, matched to the film's content. For the body, we turned the floor into a real-time interactive generative visual, making each visitor's position and movement part of the picture.

The combination of these forms completes an immersive space that envelops every sense.










2018 Taichung World Flora Exposition - Discovery Pavilion


Organizer: Education Bureau, Taichung City Government
Pavilion Planning & Execution: Cogitoimage International Co., Ltd.
Curatorial Consultant: 張光民
Overall Design: 邢福麟, 蔡正祿
Curators: 廖珮珊, 林品涵
Multimedia Planning: 王永廷

Production: Ultra Combos Co., Ltd.
Project Manager: 劉威利
Creative Director: 曾煒傑
Art Director: 黃偉哲
Technical Artist: 楊家豪
Technical Director: 吳思蔚
Executive Assistants: 賴柏榕, 徐敏碩

Animation Production: MoonShine Animation (夢想動畫)
Executive Producer: 葉傳耀
Project Manager: 侯冠如
Director: 張晃榕
Initial Script Concept: 黃筱筑
Art Setting & Illustration: 方培安, 李欣瑜, 陳硯詠, 翁聖和, 黃騰毅
Animated Storyboards: 莊鎮豪
Modeling & Texturing: 張亦德, 林廷穎, 朱家靚, 李哲誠, 林思吟
Animation: 安良啟, 鐘昀麗, 林建隆, 鍾孟穎
VFX: 林新華, 林木清, 林于傑, 洪健淇, 郭子維, 梁世勳, 陳俊良, 柯嘉邦
Lighting & Compositing: 陳俊霖, 聞書聆, 郭柏延, 胡宏愈, 許閎硯, 劉筱婷, 黃思豪, 李淑娟, 潘紫涵, 蔡幸霖
Technical Support: 賴大維

Music Design: Lim Giong (林強)
Multimedia System Integration: 澳德設計

Special Thanks:
Shei-Pa National Park Headquarters|廖林彥
Endemic Species Research Institute

Taiwan Leopard Cat Conservation Association|陳美汀
Optoma Technology

Photography: 張伯瑞
Documentary Editing: Ting-An Ho (何庭安)
Documentary Music: Takagi Masakatsu (高木正勝)


Related Works:




Project Type: Theatre
Category: Artwork
Client: Quanta Arts Foundation
Year: 2011

#Performing Art
#Generative Art
#Computer Vision
#Projection Mapping

Seventh Sense


Seventh Sense, co-created by Ultra Combos and Anarchy Dance Theatre, is a digital performing-arts project subsidized by the Council for Cultural Affairs. The five senses generally refer to the sensory organs, eyes, ears, nose, tongue, and body, through which the human body receives stimuli and gathers information from the outside, while the sixth sense is often regarded as intuition, awareness, or foresight. Seventh Sense is an interactive environment containing both performers and audience; beyond the performance itself, the piece tries to bring viewers and dancers into a mutual, shared space, creating an unimaginable experience beyond the known senses.





A Stage That Performs Alongside the Performers


The stage consists of an eight-meter-square floor and three vertical surfaces eight meters wide and four meters tall, forming a fully immersive interactive projection space. The system divides into three major parts: detection, visuals, and sound.






An Omnipresent Sensing Field


To sense the entire space, four infrared cameras were mounted on the ceiling and stitched into a single complete image, yielding the silhouette, position, and movement of everyone on stage. This information is used in real time to generate the corresponding visuals.





Visuals Composed by Algorithm


For the visuals, we tried to maximize the relationship between the composition of the image and the performers' behavior on stage, so we boldly adopted a fully generative approach: there is no pre-rendered material at all, and all content is generated live.

We also experimented with many possibilities of generative visuals, such as using perspective tricks to break the perceived space of the walls, or layering in random parameters that are continuous in time and space (Perlin noise) to stack up finer detail and variation.
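A minimal sketch of the kind of noise referred to here, 1-D value noise with Perlin's fade curve (a simplified relative of true gradient Perlin noise), which yields parameters that vary smoothly over time instead of jumping randomly:

```python
import numpy as np

def value_noise(t, seed=0):
    """1-D value noise: random values at integer lattice points,
    blended with Perlin's smoothstep fade curve, so nearby inputs give
    nearby outputs (unlike plain random numbers)."""
    rng = np.random.default_rng(seed)
    lattice = rng.uniform(-1, 1, size=256)
    t = np.asarray(t, dtype=float)
    i0 = np.floor(t).astype(int) % 256
    i1 = (i0 + 1) % 256
    f = t - np.floor(t)
    fade = f * f * f * (f * (f * 6 - 15) + 10)   # 6f^5 - 15f^4 + 10f^3
    return lattice[i0] * (1 - fade) + lattice[i1] * fade

ts = np.linspace(0, 10, 1000)
ns = value_noise(ts)   # smooth, repeatable "random" parameter curve
```

Feeding curves like this into position, color, or scale parameters gives organic drift rather than jitter, which is what makes noise-driven visuals feel continuous in time and space.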


A Sound Field That Moves with the Performers


As with the visuals, we hoped all sound would respond to everything happening on stage. But binding the music strongly to behavior places severe limits on atmosphere and harmony, so we compromised by splitting music and sound effects: the music is a pre-arranged composition, while the sound effects are tied to the performers' behavior. We also placed speakers in the four corners of the auditorium, so that as a performer moves about the stage, the sound field shifts with their position.







Media / Press

This Digital Dance Space Reacts to Performers' Movements in Real Time - Hyperallergic.

Event

2014, 06/13-15, HKICC Lee Shau Kee School of Creativity, Hong Kong, Dance Crossing series
2014, 01/10, United Arab Emirates, opening performance of the 6th Arab Theatre Festival
2013, 09/08, Cardiff, UK, opening performance at the WSD2013 congress
2013, 01/11-12, Japan Society, New York, USA
2012, 10/19-20, Centro Parraga, Spain
2012, 09/21-22, TodayArts Festival, Netherlands



Seventh Sense


Concept / Choreography: Chieh-hua Hsieh
Software Development / Visual Design: Ultra Combos
Sound Design: Ultra Combos
Lighting Design: We Do Group
Dancers: Hsiao-yuan Lin, Shao-ching Hung, Tai-yueh Chen, Yu-chieh Lee
Supported by Council for Cultural Affairs of the Republic of China



Related Works:





Project Type: Theatre
Category: Artwork
Client: Quanta Arts Foundation
Year: 2015

#Performing Art
#Real-time 360° Projection Mapping
#Kinect v2


Second Body


After 'Seventh Sense', we continued to work with Anarchy Dance Theatre. The concept for this production, according to the piece's choreographer Jeff Hsieh, began with the perception that, while driving through lanes and alleys, he relied on his innate sense of the vehicle itself—its size, movement, and position—as if it were his own body moving through space. Body movements are primarily subconscious acts honed by knowledge, the senses, and practice, and when our body is able to utilize machines, a "Second Body" is made possible through practice.

The work "Second Body" focuses on concepts of the body, exploring how its existence and very definition may change in face of an increasingly technological world, where the external environment and physical space in which we live are themselves changing.





Service / Idea


With "body" as the thematic core of the piece, we wanted to create a system that could freely alter the body's form and appearance. To this end we constructed an auditorium with an 8 m × 8 m stage surrounded by the audience, in which the dancer in the performance space is covered by 360° projection and the floor's interactive projection becomes an extension of the dancer's own body.




Workflow / Detail

Before development began on this project, because the visual and technical solutions had not yet been decided, much was unknown. To keep technology and aesthetics in balance, we split the production into three distinct stages.




Phase One: Technical Development


At this stage our aim was to experiment with visuals at a definite scale to both develop and understand the performance capabilities of our system. Our goal was to create projected images that perfectly adhered to the body of the performer.

The first problem we faced was creating seamless projections on the dancer in real time. At the time, the most accessible advanced sensor was Microsoft's Kinect v2, so our detection system was developed around it. After trials and testing, we decided to install four Kinects around the 8 m × 8 m stage, generating a seamless 360° point cloud of the dancer for model creation.
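Conceptually, fusing the four Kinects comes down to applying each sensor's calibrated rigid transform and concatenating the results. A minimal sketch with synthetic calibration data (the real system also had to handle sensor noise, occlusion, and temporal alignment):

```python
import numpy as np

def merge_point_clouds(clouds, rotations, translations):
    """Fuse clouds from several sensors into one world-space cloud by
    applying each sensor's calibrated rigid transform
    p_world = R @ p_sensor + t, then concatenating."""
    merged = [pts @ R.T + t for pts, R, t in zip(clouds, rotations, translations)]
    return np.vstack(merged)

def yaw(deg):
    """Rotation about the vertical (z) axis."""
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0, 0.0, 1.0]])

# Four sensors ringed around the stage at 90-degree intervals, each
# observing the same synthetic scene in its own local frame.
rng = np.random.default_rng(1)
world_pts = rng.uniform(-4, 4, size=(100, 3))
Rs = [yaw(d) for d in (0, 90, 180, 270)]
ts = [np.array([5.0, 0.0, 1.5]), np.array([0.0, 5.0, 1.5]),
      np.array([-5.0, 0.0, 1.5]), np.array([0.0, -5.0, 1.5])]
# What each sensor would record: p_sensor = R^T (p_world - t).
clouds = [(world_pts - t) @ R for R, t in zip(Rs, ts)]
merged = merge_point_clouds(clouds, Rs, ts)   # all four align in world space
```

With accurate per-sensor transforms, the four partial views of the dancer land on top of one another in world space, which is what makes a seamless 360° model possible.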

The next issue was projecting images onto the performer's body, which meant both linking the positional relationship of the projectors and sensors and adhering the images to the model obtained from the detection system. The former required establishing a checkerboard grid acting as the projector's calibration plane; by assessing positions on this plane, each sensor could determine its positional relationship with the projector. For the latter we used common environment-mapping techniques in 3D software, rendering the model's appearance as a mirror image that the skybox could then reflect onto the performer's body.

After establishing the basic system, we classified the visuals into two categories to test and develop their potential: the first, pre-rendered videos, was settled after many trials; the second was visuals generated live during the performance.





Phase Two: Content Creation


After completing the first stage, members of the production team better understood the qualities and limitations of the detection system, and were able to flesh out the finer details regarding script and performance structure. The creation team switched focus to content creation and the development of a more user-friendly editing system. To better grasp the relationship between the dancer and the on-stage visuals, the team spent an entire month staying in the performance space rehearsing, honing, and refining movements.





Phase Three: Establishing an Edition Suitable for Touring


Because of the piece's unique presentation and large-scale technical requirements, we specifically designed a hardware installation, logistics method, and calibration process for touring.







Event


2016.10.22 One Dance Week, Bulgaria
2016.06.17-18 Les Rencontres Chorégraphiques Internationales de Seine-Saint-Denis, France
2016.05.21-22 The Polytech Festival, Russia
2016.5.13-14 Centrum Sztuki WRO / WRO Art Center, Poland
2015.09.4-5 Ars Electronica, Austria
2014.10.31-11.2 Performing Arts School 36, Taipei






Second Body


Concept / Choreography: Chieh-hua Hsieh
Software Development / Visual Design: Ultra Combos
Sound Design: Yannick Dauby, Ultra Combos
Lighting Design: We Do Group
Costume Design: Yu-teh Yang
Visual Operator: Hsiang-ting Teng
Soloist: Kuan-ling Tsai
Dramaturgy: River Lin
Tour Management: AxE
Arts Management Project Mentor: Justine Beaujouan
Project Consultant: Kevin Cunningham
Commissioned by Quanta Arts Foundation / QAring
Supported by Ministry of Culture of the Republic of China



Related Works:




Project Type: Exhibition 
Client: BMW MINI
Location: Nangang Exhibition Hall
Year: 2016

#Installations

MINI: The Next 100 Years


The BMW Group is playing a crucial part in shaping the mobility of the future – and constantly reinventing itself in the process. Its evolution from aircraft-engine manufacturer to premium mobility service provider is quite unique. Discover 100 masterpieces from the BMW Group's first hundred years through selected exhibits at the BMW Museum. All are part of our successful history and motivate us to keep driving forward.








Making Of


To present the enormous volume of material from the past hundred years in a clean way, we greatly simplified the operating model during planning, keeping only three main threads, "Past," "Present," and "Future," and tucking secondary information into sub-folders; a clean hierarchy, animated, guides visitors through MINI's history. To keep the interaction working well even with the exhibition floor packed with people, we placed the interactive area closest to the audience so visitors could operate it in the simplest way: with clean physical flow, the user can swipe freely on the touch panel and easily switch and drift through a sea of photographs.




To match the highly commercial setting without slipping into kitsch, we repeatedly proposed fine adjustments and cuts to the layout and motion throughout the project. After design and production were complete, client feedback and on-site response were so positive that interactive walls and information kiosks were subsequently added to flagship stores and exhibitions nationwide.




Related Works:




Project Type: Event & Branding 
Client:  Heineken
Location: Kelti International
Year: 2013


Heineken Time Traveler


To celebrate Heineken's 140th birthday in Taiwan, we created a time-travel interactive booth in the heart of Taipei. It takes you on a journey through time to explore the magic of Heineken. The booth is 6 meters wide, 3 meters high, and 3 meters deep, with 3D projection mapping on the surrounding walls to provide a true time-travel experience. During the interaction, you use a Leap Motion controller to explore the magic of Heineken across 140 years. Once the time travel is finished, a live message is posted to your Facebook announcing your journey, and you also receive an exclusive bottle of Heineken with your name on it as a souvenir.






Making Of:






Related Works:



Project Type: Event
Client:  Luxgen
Location: Taipei City Lake District
Year: 2016

#MotionSensing
#VR
#ParentChildInteraction

Luxgen Genius


The LUXGEN GENIUS+ experience center breaks with the traditional car showroom model, creating a high-tech, human-centered viewing space in a major business district. It includes a fully digital virtual car-viewing platform with 1:1 full-scale virtual models, letting visitors feel real vehicle proportions; 4K imaging raises picture quality, and visitors pick a favorite model on a touch screen to bring it up instantly on the big screen and examine the entire exterior and interior.




Children Area


Beyond virtual car viewing, a touch-based sales-support system offers a human-centered purchasing service, making choices more convenient while raising sales efficiency and customer satisfaction. LUXGEN also built a motion-sensing children's interactive area: staff hand children line-art sheets of the six models across the LUXGEN range to color in, and the drawings are then projected into a car world on the big screen. At a fingertip's touch, each car plays a different animation showing off product features, adding a sense of fun. Through this area, LUXGEN hopes to spark children's interest in car design and let them create vehicles of their own.







Related Works: