
Interview

Chief Researcher IM Jae-ho of VFX studio eNgine on the future of virtual production

Jan 05, 2021
  • by KIM Su-bin
“It is important to generate a success story with VP”


As the end of the pandemic still seems far away, virtual production (hereinafter referred to as VP) is picking up steam. David MORIN, chairman of the US-based Virtual Production Committee, defines virtual production as the ability to blend live footage and computer graphics instantly, get feedback in real time, and make decisions about VFX and animation while on the set. eNgine Visual Wave is a VFX subsidiary established by media group NEW in July last year that is actively developing the latest innovations in media technology, which naturally include virtual production. IM Jae-ho, who leads the company’s efforts in the development of VP tools as chief researcher of the R&D unit Lab51, talks about what he worked on over the past year at eNgine and the latest developments in virtual production in Korea and abroad.

As eNgine enters its second year in business, could you tell us what kept you busy during the past year?
We did the previs concepts for DELIVER US FROM EVIL (2020), and also worked on the visual effects for SF8 (2020). We also worked on an augmented reality project that started with the 2019 edition of the Melon Music Awards. Apart from these, we did some research and development in virtual production and digital humans, and although they have not been announced yet, there are also several other projects that are progressing well.

And what are you currently working on?
We are investing a lot in two major areas. The first is R&D on digital humans and the development of tools applicable to production, which is supervised by SONG Jae-won, and the second, which I am in charge of, is developing our research and engineering capacities regarding virtual production. In the case of digital humans, a large number of companies have shown finished products over the past year. The internal objective we set for ourselves and are currently working toward is to create digital humans that show emotions and can speak flawlessly. Many companies are jumping on the bandwagon of virtual production. In a way, you could say that, in 2020, virtual production was the trending topic in the industry. We focused our attention on techniques that save time when designing a scene while delivering similar results, and on techniques that support a wider range of experiences.


Among eNgine's achievements, one thing that immediately stands out is the virtual AR studio stages for the 2019 Melon Music Awards. The stage designs of several performances, including the one for the BTS song Mikrokosmos, were created with the Unreal Engine 4 real-time game engine from Epic Games. Is there any reason your first work was an AR project?
The 2019 Melon Music Awards was the first project we took on at the R&D unit after the company was founded. The previs unit produced a pre-rendering of the concept that provided the general stage direction and gave an idea of what the final product would look like, and then we produced the actual visuals based on these concept images. This process was successfully employed to create the virtual stages for BTS' Mikrokosmos, the large whale flying above the stage for TXT, the eagle for Chungha, and the 3D text for ITZY. We started working on this AR project because, when looking at the trends in the industry, there was an internal consensus that we were going to see a growing demand for AR, VR, and new media.

Are you working on projects related to some of the franchises of NEW?
We are planning and developing VR and AR experiences and short animations based on the Peninsula (2020) and TRAIN TO BUSAN (2016) franchise. One of the best examples you can find abroad is The Walking Dead franchise. There is a VR game set in the fictional world of the comics and the TV series called The Walking Dead: Saints & Sinners. The comic books, the series, and the game share the same fictional world. The Korean-style post-apocalyptic experience we are designing based on TRAIN TO BUSAN (2016) and Peninsula (2020) will adopt a similar form.


Amid the COVID-19 pandemic, interest in virtual production has been growing as it takes film production beyond the limits of the physical world. Could you feel that interest was growing along with a more practical approach to filmmaking?
This is something I prefer to be prudent about. It would be difficult to establish how much VP has grown around the globe or whether Korea is on par with that level of development. It is true that VP is now accepted as something valuable when looking at the global trends. Above all, with Disney's The Mandalorian, the LED in-camera VFX technique, which consists of filming against an LED wall setup [Ed: one that shows digital backgrounds produced in real time by a game engine, rather than an actual set], has been trending. Overseas, they are making constant efforts to create huge-scale sets with this technology and, by doing so, to save on production costs. Hollywood is the current front-runner, while the UK and other countries are boosting their own technological prowess. Meanwhile, it seems that in Korea such techniques are still in their infancy. I heard that the Netflix series Sweet Home used this technology, but since it has not been revealed which VP technique was used, nor for which scene, it is difficult to conclude where Korea stands in terms of VP.
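The in-camera VFX setup described above hinges on knowing which part of the LED wall the physical camera actually sees, so that only that region (the "inner frustum") is rendered in full perspective while the rest of the wall shows a cheaper static background. A minimal sketch of that geometry, assuming a flat wall on the plane z = 0, a camera in front of it, and only the horizontal field of view (real systems track the full 6-DoF pose and lens data):

```python
import math

def inner_frustum_on_wall(cam_pos, cam_yaw_deg, hfov_deg, wall_z=0.0):
    """Project the camera's horizontal view frustum onto a flat LED wall
    (the plane z = wall_z) and return the x-extent of the wall region
    that must be rendered in full perspective ("inner frustum").
    cam_pos is (x, y, z) in meters with the camera in front of the wall
    (z < wall_z), yaw 0 means facing the wall head-on."""
    cx, _, cz = cam_pos
    half = math.radians(hfov_deg) / 2.0
    yaw = math.radians(cam_yaw_deg)
    edges = []
    for angle in (yaw - half, yaw + half):
        dz = math.cos(angle)          # ray direction toward the wall
        if dz <= 0.0:
            raise ValueError("frustum edge does not hit the wall")
        t = (wall_z - cz) / dz        # distance along the ray to the wall plane
        edges.append(cx + math.sin(angle) * t)
    return min(edges), max(edges)
```

For example, a camera 4 m from the wall with a 90° horizontal FOV sees an 8 m-wide strip of the wall; as the camera moves or pans, the engine re-renders that strip every frame, which is what makes the background parallax look correct in camera.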

Although there is a wider awareness of the importance of virtual production, changing the current filming paradigm will not be easy. It will be important to demonstrate to the creators the necessity of VP and present them with concrete examples of its applications. What is your strategy in this regard?
That aspect seems to be the most difficult. It is never easy to abandon the tools you have grown accustomed to, so this is a matter of being able to use VP without altering the way we make films. To get there, there are still many hurdles to overcome. And that's because we have yet to see a success story. Spending your budget on a technology that has yet to produce a success story can feel like a gamble. That's why it is important for us to create that successful example first, and it's imperative to implement VP while preserving the old way of working in the industry. Currently, these are the two aspects we are focusing on.

Building the proper environment for VP requires a variety of technologies, such as motion capture to translate the camera position into the digital environment, real-time 3D rendering, and LED video displays, so cooperation with companies specialized in each of these fields must play an important role.
SONG Jae-won is a specialist with a PhD in performance and facial capture, and our company takes great pride in his high level of technical skill. In addition, we signed memorandums of understanding with other companies that allow us to share technologies we could all benefit from, and we are in contact with graduate schools like Korea University and KAIST that have the expertise and know-how for all matters regarding computer graphics.


What would you say differentiates eNgine from other VFX studios when it comes to VP deployment and R&D?
What is different about eNgine is that we have the ability to fully develop technical tools from their initial concept to a product that can be used by the industry. Papers published at international computer graphics conferences such as Siggraph have produced many results at the research level. But these technologies are not always directly relevant to the industry. Some of them can only be implemented after overhauling many of their components. Since SONG Jae-won and I both have PhDs in computer graphics, we have the ability to read, understand, and apply the conclusions of these technical papers. We refine the rough ideas introduced by the corresponding papers into something that can actually be used in film production. Once the technology we refined is passed on to the technical artists, they can decide by themselves how to best implement the tools or processes they have access to. What is distinctive about eNgine's R&D team is that it can be involved at every step of the development of these technologies, from the moment they are still proofs of concept to the point where they are finished products that can be applied in film production.

What are your objectives as a researcher?
Our team's goal is to make eNgine Visual Wave the first name that comes to mind in Korea when someone says "virtual production". We aim to be a world-class VP company, the one that makes the best use of virtual production in Korea and produces the most with VP.
Any copying, republication or redistribution of KOFIC's content is prohibited without prior consent of KOFIC.