Pushy to the Limit: Evolving Netflix’s WebSocket proxy for the future

September 10, 2024 (updated September 11, 2024)
Netflix Technology Blog

By Karthik Yagna, Baskar Odayarkoil, and Alex Ellis

Pushy is Netflix’s WebSocket server that maintains persistent WebSocket connections with devices running the Netflix application. This allows data to be sent to the device from backend services on demand, without the need for continually polling requests from the device. Over the last few years, Pushy has seen tremendous growth, evolving from its role as a best-effort message delivery service to being an integral part of the Netflix ecosystem. This post describes how we’ve grown and scaled Pushy to meet its new and future needs, as it handles hundreds of millions of concurrent WebSocket connections, delivers hundreds of thousands of messages per second, and maintains a steady 99.999% message delivery reliability rate.

There were two main motivating use cases that drove Pushy’s initial development and usage. The first was voice control, where you can play a title or search using your virtual assistant with a voice command like “Show me Stranger Things on Netflix.” (See How to use voice controls with Netflix if you want to try this yourself!)

If we consider the Alexa use case, we can see how this partnership with Amazon enabled it to work. Once they receive the voice command, we allow them to make an authenticated call through apiproxy, our streaming edge proxy, to our internal voice service. This call includes metadata, such as the user’s information and details about the command, like the specific show to play. The voice service then constructs a message for the device and places it on the message queue, which is then processed and sent to Pushy to deliver to the device. Finally, the device receives the message, and the action, such as “Show me Stranger Things on Netflix”, is performed. This initial functionality was built out for FireTVs and was expanded from there.

Sample system diagram for an Alexa voice command, with the voice command entering Netflix’s cloud infrastructure via apiproxy and exiting via a server-side message through Pushy to the device. Where AWS ends and the internet begins is an exercise left to the reader.
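The queue-based half of that flow can be sketched in a few lines. This is an illustrative stand-in only: `DeviceMessage`, `VoiceCommandFlow`, and the `PLAY_TITLE` action are invented names, not Netflix APIs, and the real message queue is a distributed system rather than an in-memory deque.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Hypothetical sketch of the indirect path: the voice service builds a
// message and places it on a queue; a processor later hands it to Pushy.
public class VoiceCommandFlow {
    // A minimal message: which device, and what action to perform.
    record DeviceMessage(String deviceId, String action) {}

    static final Queue<DeviceMessage> queue = new ArrayDeque<>();

    // The voice service constructs the message and enqueues it.
    static void enqueueVoiceCommand(String deviceId, String show) {
        queue.add(new DeviceMessage(deviceId, "PLAY_TITLE:" + show));
    }

    // The message processor drains the queue and would forward to Pushy.
    static DeviceMessage processNext() {
        return queue.poll();
    }
}
```

The decoupling is the point: the voice service only needs to enqueue and move on, while delivery happens asynchronously downstream.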

The other main use case was RENO, the Rapid Event Notification System mentioned above. Before the integration with Pushy, the TV UI would continuously poll a backend service to see if there were any row updates to get the latest information. These requests would happen every few seconds, which ended up creating extraneous requests to the backend and were costly for devices, which are frequently resource constrained. The integration with WebSockets and Pushy alleviated both of these points, allowing the origin service to send row updates as they were ready, resulting in lower request rates and cost savings.

For more background on Pushy, you can see this InfoQ talk by Susheel Aroskar. Since that presentation, Pushy has grown in both size and scope, and this article will discuss the investments we’ve made to evolve Pushy for the next generation of features.

This integration was initially rolled out for Fire TVs, PS4s, Samsung TVs, and LG TVs, leading to a reach of about 30 million candidate devices. With these clear benefits, we continued to build out this functionality for more devices, enabling the same efficiency wins. As of today, we’ve expanded our list of candidate devices even further to nearly a billion devices, including mobile devices running the Netflix app and the website experience. We’ve even extended support to older devices that lack modern capabilities, like support for TLS and HTTPS requests. For those, we’ve enabled secure communication from client to Pushy via an encryption/decryption layer on each, allowing for confidential messages to flow between the device and server.

Growth

With that extended reach, Pushy has gotten busier. Over the last five years, Pushy has gone from tens of millions of concurrent connections to hundreds of millions of concurrent connections, and it regularly reaches 300,000 messages sent per second. To support this growth, we’ve revisited Pushy’s past assumptions and design decisions with an eye towards both Pushy’s future role and future stability. Pushy had been relatively hands-free operationally over the past few years, and as we updated Pushy to fit its evolving role, our goal was also to get it into a stable state for the next few years. This is particularly important as we build out new functionality that relies on Pushy; a strong, stable infrastructure foundation allows our partners to continue to build on top of Pushy with confidence.

Throughout this evolution, we’ve been able to maintain high availability and a consistent message delivery rate, with Pushy successfully maintaining 99.999% reliability for message delivery over the last few months. When our partners want to deliver a message to a device, it’s our job to make sure they can do so.

Here are a few of the ways we’ve evolved Pushy to handle its growing scale.

A few of the related services in Pushy’s immediate ecosystem and the changes we’ve made for them.

Message processor

One aspect that we invested in was the evolution of the asynchronous message processor. The previous version of the message processor was a Mantis stream-processing job that processed messages from the message queue. It was very efficient, but it had a fixed job size, requiring manual intervention if we wanted to horizontally scale it, and it required manual intervention when rolling out a new version.

It served Pushy’s needs well for many years. As the scale of the messages being processed increased and we were making more code changes in the message processor, we found ourselves looking for something more flexible. In particular, we were looking for some of the features we enjoy with our other services: automatic horizontal scaling, canaries, automated red/black rollouts, and more observability. With this in mind, we rewrote the message processor as a standalone Spring Boot service using Netflix paved-path components. Its job is the same, but it does so with easy rollouts, canary configuration that lets us roll changes safely, and autoscaling policies we’ve defined to let it handle varying volumes.

Rewriting always comes with risk, and it’s never the first solution we reach for, particularly when working with a system that’s in place and working well. In this case, we found that the burden of maintaining and improving the custom stream processing job was growing, and we made the judgment call to do the rewrite. Part of the reason we did so was the clear role that the message processor played: we weren’t rewriting a huge monolithic service, but instead a well-scoped component that had explicit goals, well-defined success criteria, and a clear path towards improvement. Since the rewrite was completed in mid-2023, the message processor component has been completely zero touch, happily automated and running reliably on its own.

Push Registry

For most of its life, Pushy has used Dynomite for keeping track of device connection metadata in its Push Registry. Dynomite is a Netflix open source wrapper around Redis that provides several additional features like auto-sharding and cross-region replication, and it provided Pushy with low latency and easy record expiry, both of which are critical for Pushy’s workload.

As Pushy’s portfolio grew, we experienced some pain points with Dynomite. Dynomite had great performance, but it required manual scaling as the system grew. The folks on the Cloud Data Engineering (CDE) team, the ones building the paved path for internal data at Netflix, graciously helped us scale it up and make adjustments, but it ended up being an involved process as we kept growing.

These pain points coincided with the introduction of KeyValue, a new offering from the CDE team that is roughly “HashMap as a service” for Netflix developers. KeyValue is an abstraction over the storage engine itself, which allows us to choose the best storage engine that meets our SLO needs. In our case, we value low latency: the faster we can read from KeyValue, the faster those messages can get delivered. With CDE’s help, we migrated our Push Registry to use KV instead, and we have been extremely happy with the result. After tuning our store for Pushy’s needs, it has been on autopilot since, appropriately scaling and serving our requests with very low latency.
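The two properties the registry needs from its store, low-latency reads and automatic record expiry, can be modeled with a plain map. This is purely an illustration of the TTL behavior (`RegistrySketch` and its methods are invented names); the real system gets expiry from KeyValue, not application code.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of what the Push Registry needs from its store: fast lookups
// plus records that expire on their own, modeled here with a TTL check.
public class RegistrySketch {
    record Entry(String pushyInstance, long expiresAtMillis) {}

    final Map<String, Entry> store = new HashMap<>();

    // A device registers which Pushy instance holds its connection.
    void register(String deviceId, String pushyInstance, long nowMillis, long ttlMillis) {
        store.put(deviceId, new Entry(pushyInstance, nowMillis + ttlMillis));
    }

    // Expired records read as absent, mimicking store-side TTL expiry.
    String lookup(String deviceId, long nowMillis) {
        Entry e = store.get(deviceId);
        if (e == null || nowMillis >= e.expiresAtMillis()) return null;
        return e.pushyInstance();
    }
}
```

Store-side expiry matters here because a crashed Pushy instance never gets the chance to clean up its own registry entries; TTLs bound how long stale entries can linger.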

Scaling Pushy horizontally and vertically

Most of the other services our team runs, like apiproxy, the streaming edge proxy, are CPU bound, and we have autoscaling policies that scale them horizontally when we see an increase in CPU usage. This maps well to their workload: more HTTP requests means more CPU used, and we can scale up and down accordingly.

Pushy has slightly different performance characteristics, with each node maintaining many connections and delivering messages on demand. In Pushy’s case, CPU usage is consistently low, since most of the connections are parked and waiting for an occasional message. Instead of relying on CPU, we scale Pushy on the number of connections, with exponential scaling to scale faster after higher thresholds are reached. We load balance the initial HTTP requests to establish the connections and rely on a reconnect protocol where devices will reconnect every 30 minutes or so, with some staggering, which gives us a steady stream of reconnecting devices to balance connections across all available instances.

For several years, our scaling policy had been that we would add new instances when the average number of connections reached 60,000 connections per instance. For a couple hundred million devices, this meant that we were regularly running thousands of Pushy instances. We can horizontally scale Pushy to our heart’s content, but we would be less content with our bill and would have to shard Pushy further to get around NLB connection limits. This evolution effort aligned well with an internal focus on cost efficiency, and we used it as an opportunity to revisit these earlier assumptions with an eye towards efficiency.

Both of these would be helped by increasing the number of connections that each Pushy node could handle, reducing the total number of Pushy instances and running more efficiently with the right balance between instance type, instance cost, and maximum concurrent connections. It would also give us more breathing room with the NLB limits, reducing the toil of additional sharding as we continue to grow. That being said, increasing the number of connections per node is not without its own drawbacks. When a Pushy instance goes down, the devices that were connected to it will immediately try to reconnect. By increasing the number of connections per instance, we would be increasing the number of devices that would immediately try to reconnect. We could have a million connections per instance, but a down node would lead to a thundering herd of a million devices reconnecting at the same time.

This delicate balance led us to do a deep evaluation of many instance types and performance tuning options. Striking that balance, we ended up with instances that handle an average of 200,000 connections per node, with breathing room to go up to 400,000 connections if we had to. This makes for a nice balance between CPU usage, memory usage, and the thundering herd when a device connects. We’ve also enhanced our autoscaling policies to scale exponentially; the farther we are past our target average connection count, the more instances we’ll add. These improvements have enabled Pushy to be almost entirely hands off operationally, giving us plenty of flexibility as more devices come online in different patterns.
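One way to read “the farther past the target, the more instances we add” is as a step function whose step size grows with the overshoot. The policy below is a guessed illustration (the target, band width, and doubling behavior are assumptions, not Netflix’s actual configuration):

```java
// Illustrative connection-based autoscaling: each full "band" that the
// fleet's average connection count sits above the target doubles the
// number of instances added, so deep overshoots recover quickly.
public class ConnectionScaler {
    // avgConnections: current average per instance
    // target: desired average (e.g. 200_000)
    // band: width of each escalation band (e.g. 50_000)
    static int instancesToAdd(int avgConnections, int target, int band) {
        if (avgConnections <= target) return 0;
        int bandsOver = (avgConnections - target) / band;
        return 1 << Math.min(bandsOver, 10); // 1, 2, 4, ... capped
    }
}
```

Compared with CPU-based scaling, the input signal (connection count) moves slowly and predictably thanks to the staggered 30-minute reconnect protocol, which is what makes a policy like this stable.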

Reliability & building a stable foundation

Alongside these efforts to scale Pushy for the future, we also took a close look at our reliability after finding some connectivity edge cases during recent feature development. We found a few areas for improvement around the connection between Pushy and the device, with failures due to Pushy attempting to send messages on a connection that had failed without notifying Pushy. Ideally something like a silent failure wouldn’t happen, but we frequently see odd client behavior, particularly on older devices.

In collaboration with the client teams, we were able to make some improvements. On the client side, better connection handling and improvements around the reconnect flow meant that they were more likely to reconnect appropriately. In Pushy, we added additional heartbeats, idle connection cleanup, and better connection tracking, which meant that we were keeping around fewer and fewer stale connections.
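Idle connection cleanup amounts to tracking when each connection was last heard from and reaping the silent ones. A minimal sketch, with invented names (in a Netty-based server like Pushy this kind of logic would typically hang off idle-state events rather than a manual sweep):

```java
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

// Sketch of server-side idle-connection cleanup: heartbeats refresh a
// last-seen timestamp, and a periodic sweep closes stale connections.
public class IdleConnectionReaper {
    // connection id -> last time we heard from the device (millis)
    final Map<String, Long> lastSeen = new HashMap<>();

    // Called on any inbound frame, including heartbeats.
    void recordActivity(String connId, long nowMillis) {
        lastSeen.put(connId, nowMillis);
    }

    // Forget any connection silent for longer than the idle timeout;
    // returns how many stale connections were reaped.
    int reap(long nowMillis, long idleTimeoutMillis) {
        int reaped = 0;
        Iterator<Map.Entry<String, Long>> it = lastSeen.entrySet().iterator();
        while (it.hasNext()) {
            if (nowMillis - it.next().getValue() > idleTimeoutMillis) {
                it.remove();
                reaped++;
            }
        }
        return reaped;
    }
}
```

The added heartbeats are what make this safe: without regular inbound traffic, a healthy-but-quiet connection would be indistinguishable from a silently failed one.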

While these improvements were mostly around those edge cases for the feature development, they had the side benefit of bumping our message delivery rates up even further. We already had a great message delivery rate, but this additional bump has enabled Pushy to regularly average 5 9s of message delivery reliability.

Push message delivery success rate over a recent two-week period, staying consistently over 5 9s of reliability.

With this stable foundation and all of these connections, what can we now do with them? This question has been the driving force behind nearly all of the recent features built on top of Pushy, and it’s an exciting question to ask, particularly as an infrastructure team.

Shift towards direct push

The first change from Pushy’s traditional role is what we call direct push; instead of a backend service dropping the message on the asynchronous message queue, it can instead leverage the Push library to skip the asynchronous queue entirely. When called to deliver a message in the direct path, the Push library will look up the Pushy connected to the target device in the Push Registry, then send the message directly to that Pushy. Pushy will respond with a status code reflecting whether it was able to successfully deliver the message or it encountered an error, and the Push library will bubble that up to the calling code in the service.

The system diagram for the direct and indirect push paths. The direct push path goes directly from a backend service to Pushy, while the indirect path goes to a decoupled message queue, which is then handled by a message processor and sent on to Pushy.
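The direct path boils down to a registry lookup followed by a synchronous send with a status the caller can act on. A sketch under assumed names (`DirectPushSketch`, the `Status` values, and the in-memory registry are all illustrative; the real lookup hits the Push Registry store and the send is an RPC):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the direct push path: find which Pushy instance holds the
// target device's connection, send straight to it, and surface the
// result so the caller can retry or fall back immediately.
public class DirectPushSketch {
    enum Status { DELIVERED, DEVICE_OFFLINE }

    // Push Registry stand-in: device id -> Pushy instance holding it.
    static final Map<String, String> pushRegistry = new HashMap<>();

    static Status sendDirect(String deviceId, String payload) {
        String pushyInstance = pushRegistry.get(deviceId);
        if (pushyInstance == null) {
            // Immediate feedback, unlike the fire-and-forget queue path.
            return Status.DEVICE_OFFLINE;
        }
        // In the real system this is an RPC to that Pushy instance,
        // which writes the payload onto the device's WebSocket.
        return Status.DELIVERED;
    }
}
```

The contrast with the indirect path is the return value: the queue gives durability and decoupling, while the direct path trades that for an immediate answer about delivery.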

Susheel, the original author of Pushy, added this functionality as an optional path, but for years, nearly all backend services relied on the indirect path, with its “best-effort” delivery being good enough for their use cases. In recent years, we’ve seen usage of this direct path really take off as the needs of backend services have grown. In particular, rather than being just best effort, these direct messages give the calling service immediate feedback about the delivery, letting them retry if a device they’re targeting has gone offline.

These days, messages sent via direct push make up the majority of messages sent through Pushy. For example, over a recent 24 hour period, direct messages averaged around 160,000 messages per second and indirect messages averaged around 50,000 messages per second.

Graph of direct vs indirect messages per second, showing around 150,000 direct messages per second and around 50,000 indirect messages per second.

Device to device messaging

As we’ve thought through this evolving use case, our concept of a message sender has also evolved. What if we wanted to move past Pushy’s pattern of delivering server-side messages? What if we wanted to have a device send a message to a backend service, or maybe even to another device? Our messages had traditionally been unidirectional, sent from the server to the device, but we now leverage these bidirectional connections and direct device messaging to enable what we call device to device messaging. This device to device messaging supported early phone-to-TV communication in support of games like Triviaverse, and it’s the messaging foundation for our Companion Mode as TVs and phones communicate back and forth.

A screenshot of one of the authors playing Triviaquest with a mobile device as the controller.

This requires higher level knowledge of the system, where we need to know not just information about a single device, but broader information, like which devices are connected for an account that the phone can pair with. It also enables things like subscribing to device events to know when another device comes online and is available to pair or send a message to. This has been built out with an additional service that receives device connection information from Pushy. These events, sent over a Kafka topic, let the service keep track of the device list for a given account. Devices can subscribe to these events, allowing them to receive a message from the service when another device for the same account comes online.

Pushy and its relationship with the Device List Service for discovering other devices. Pushy reaches out to the Device List Service, and when it receives the device list in response, propagates that back to the requesting device.
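The core of such a service is a per-account membership set maintained from connect and disconnect events. This sketch uses invented names (`DeviceListService`, `onConnect`, `onDisconnect`) and a plain map; in the real system these events arrive over a Kafka topic and the state would be durable:

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Sketch of a device-list service consuming connection events from Pushy
// and answering "which devices are online for this account?" queries.
public class DeviceListService {
    // account id -> currently connected device ids
    final Map<String, Set<String>> byAccount = new HashMap<>();

    void onConnect(String accountId, String deviceId) {
        byAccount.computeIfAbsent(accountId, k -> new HashSet<>()).add(deviceId);
    }

    void onDisconnect(String accountId, String deviceId) {
        Set<String> devices = byAccount.get(accountId);
        if (devices != null) devices.remove(deviceId);
    }

    // What a phone would query to find TVs it can pair with.
    Set<String> devicesFor(String accountId) {
        return byAccount.getOrDefault(accountId, Set.of());
    }
}
```

Keeping this state in a separate service, fed by events, means Pushy itself stays focused on connections and forwarding rather than account-level bookkeeping.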

This device list enables the discoverability aspect of these device to device messages. Once the devices have this knowledge of the other devices connected for the same account, they’re able to choose a target device from this list that they can then send messages to.

Once a device has that list, it can send a message to Pushy over its WebSocket connection with that device as the target, in what we call a device to device message (1 in the diagram below). Pushy looks up the target device’s metadata in the Push Registry (2) and sends the message to the second Pushy that the target device is connected to (3), as if it were the backend service in the direct push pattern above. That Pushy delivers the message to the target device (4), and the original Pushy will receive a status code in response, which it can pass back to the source device (5).

A basic order of events for a device to device message.
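The five-step flow above can be condensed into one routing function. Everything here is an illustrative stand-in (the names, the integer status codes, the in-memory registry); the real forward in steps 3 and 4 is an RPC plus a WebSocket write:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the device to device flow: the source device's Pushy receives
// the message (1), resolves the target's Pushy in the registry (2),
// forwards it (3), that Pushy delivers to the device (4), and a status
// code flows back to the source device (5).
public class DeviceToDeviceSketch {
    static final Map<String, String> pushRegistry = new HashMap<>(); // device -> pushy

    // Returns the status code the source device ultimately sees.
    static int route(String targetDeviceId, String payload) {
        String targetPushy = pushRegistry.get(targetDeviceId);            // step 2
        if (targetPushy == null) return 404;                              // target offline
        boolean delivered = forward(targetPushy, targetDeviceId, payload); // steps 3-4
        return delivered ? 200 : 500;                                     // step 5
    }

    static boolean forward(String pushyInstance, String deviceId, String payload) {
        return true; // stand-in for the RPC and WebSocket write
    }
}
```

Note how this reuses the direct push machinery: from the second Pushy’s perspective, a device to device message is indistinguishable from a backend service’s direct push.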

The messaging protocol

We’ve defined a basic JSON-based message protocol for device to device messaging that lets these messages be passed from the source device to the target device. As a networking team, we naturally lean towards abstracting the communication layer with encapsulation wherever possible. This generalized message means that device teams are able to define their own protocols on top of these messages; Pushy would just be the transport layer, happily forwarding messages back and forth.

A simple block diagram showing the client app protocol on top of the device to device protocol, which itself is on top of the WebSocket & Pushy protocol.
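To make the layering concrete, a hypothetical envelope for such a message might look like the following. The field names are purely illustrative, not Netflix’s actual protocol; the point is that the payload is opaque to Pushy, which only reads the routing fields.

```json
{
  "targetDeviceId": "tv-123",
  "sourceDeviceId": "phone-456",
  "payload": {
    "note": "opaque client-app protocol content; Pushy forwards it unread"
  }
}
```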

This generalization paid off in terms of investment and operational support. We built the majority of this functionality in October 2022, and we’ve only needed small tweaks since then. We needed nearly no modifications as client teams built out the functionality on top of this layer, defining the higher level application-specific protocols that powered the features they were building. We really do enjoy working with our partner teams, but if we’re able to give them the freedom to build on top of our infrastructure layer without us getting involved, then we’re able to increase their velocity, make their lives easier, and play our infrastructure role as message platform providers.

With early features in experimentation, Pushy sees an average of 1,000 device to device messages per second, a number that will only continue to grow.

Graph of device to device messages per second, showing an average of 1,000 messages per second.

The Netty-gritty details

In Pushy, we handle incoming WebSocket messages in our PushClientProtocolHandler (code pointer to the class in Zuul that we extend), which extends Netty’s ChannelInboundHandlerAdapter and is added to the Netty pipeline for each client connection. We listen for incoming WebSocket messages from the connected device in its channelRead method and parse the incoming message. If it’s a device to device message, we pass the message, the ChannelHandlerContext, and the PushUserAuth information about the connection’s identity to our DeviceToDeviceManager.

A rough overview of the internal organization for these components. Inside Pushy, a PushClientProtocolHandler inside a Netty channel calls out to the DeviceToDeviceManager, which itself calls out to the PushMessageSender class that forwards the message on to the other Pushy.
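A greatly simplified, dependency-free analogue of that dispatch decision is sketched below. The real handler extends Netty’s ChannelInboundHandlerAdapter and works with WebSocket frames and a ChannelHandlerContext; here the frame is just a string, the `D2D:` prefix is invented, and the manager interface is a stand-in for the real DeviceToDeviceManager.

```java
// Simplified model of the channelRead dispatch: parse the inbound frame
// and hand device to device messages, plus the connection's identity,
// to the manager; other message types would take other paths.
public class HandlerSketch {
    interface DeviceToDeviceManager {
        void handle(String message, String identity);
    }

    static final StringBuilder otherPath = new StringBuilder();

    // Stand-in for PushClientProtocolHandler.channelRead.
    static void channelRead(String frame, String identity, DeviceToDeviceManager mgr) {
        if (frame.startsWith("D2D:")) {
            mgr.handle(frame.substring(4), identity);
        } else {
            otherPath.append("other:").append(frame);
        }
    }
}
```

The key structural point survives the simplification: the protocol handler only classifies and delegates, so all device to device logic stays in one place.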

The DeviceToDeviceManager is responsible for validating the message, doing some bookkeeping, and kicking off an async call that validates that the device is an authorized target, looks up the Pushy for the target device in the local cache (or makes a call to the data store if it’s not found), and forwards on the message. We run this asynchronously to avoid any event loop blocking due to these calls. The DeviceToDeviceManager is also responsible for observability, with metrics around cache hits, calls to the data store, message delivery rates, and latency percentile measurements. We’ve relied heavily on these metrics for alerts and optimizations; Pushy really is a metrics service that occasionally delivers a message or two!
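The cache-first, store-fallback, off-the-event-loop lookup pattern can be sketched with CompletableFuture. The class and field names are invented, and plain maps stand in for the local cache and the KeyValue store:

```java
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of the lookup pattern described above: consult a local cache
// first, fall back to the data store asynchronously, and populate the
// cache on the way back so subsequent messages skip the store entirely.
public class TargetLookupSketch {
    final Map<String, String> localCache = new ConcurrentHashMap<>();
    final Map<String, String> dataStore = new ConcurrentHashMap<>(); // KeyValue stand-in

    CompletableFuture<String> findPushyFor(String deviceId) {
        String cached = localCache.get(deviceId);
        if (cached != null) return CompletableFuture.completedFuture(cached);
        // Run the store read off-thread so an event loop calling this
        // method is never blocked on I/O.
        return CompletableFuture.supplyAsync(() -> {
            String fromStore = dataStore.get(deviceId);
            if (fromStore != null) localCache.put(deviceId, fromStore);
            return fromStore;
        });
    }
}
```

In a Netty server this discipline matters because a single blocked event loop thread stalls every connection pinned to it, not just the one making the call.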

Security

As the edge of the Netflix cloud, security considerations are always top of mind. With every connection over HTTPS, we’ve limited these messages to just authenticated WebSocket connections, added rate limiting, and added authorization checks to ensure that a device is allowed to target another device; you may have the best intentions in mind, but I’d strongly prefer it if you weren’t able to send arbitrary data to my personal TV from yours (and vice versa, I’m sure!).
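Those last two checks compose naturally into a single gate in front of the send path. This is a deliberately crude sketch: the token bucket has no refill logic, and the allow-list model, class name, and parameters are all assumptions rather than Pushy’s actual mechanism.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Sketch of a per-message gate: an allow-list authorization check
// (may this device target that one?) followed by a simple token-bucket
// rate limit on the sending device.
public class MessageGuard {
    final Map<String, Set<String>> allowedTargets = new HashMap<>();
    final Map<String, Integer> tokens = new HashMap<>(); // device -> tokens left
    final int bucketSize;

    MessageGuard(int bucketSize) { this.bucketSize = bucketSize; }

    boolean permit(String sourceDevice, String targetDevice) {
        // Authorization first: unauthorized targets never consume tokens.
        if (!allowedTargets.getOrDefault(sourceDevice, Set.of())
                .contains(targetDevice)) return false;
        // Rate limit: spend one token per message (refill omitted here).
        int left = tokens.getOrDefault(sourceDevice, bucketSize);
        if (left == 0) return false;
        tokens.put(sourceDevice, left - 1);
        return true;
    }
}
```

Ordering the checks this way also keeps the rate limiter from being polluted by traffic that was never going to be delivered.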

Latency and other considerations

One main consideration with the products built on top of this is latency, particularly when this feature is used for anything interactive within the Netflix app.

We’ve added caching to Pushy to reduce the number of lookups in the hot path for things that are unlikely to change frequently, like a device’s allowed list of targets and the Pushy instance the target device is connected to. We do have to do some lookups on the initial messages to know where to send them, but it allows us to send subsequent messages faster without any KeyValue lookups. For those requests where caching removed KeyValue from the hot path, we were able to greatly speed things up. From the incoming message arriving at Pushy to the response being sent back to the device, we reduced median latency to less than a millisecond, with the 99th percentile of latency at less than 4ms.

Our KeyValue latency is usually very low, but we have seen brief periods of elevated read latencies due to underlying issues in our KeyValue datastore. Overall latencies increased for other parts of Pushy, like client registration, but we saw very little increase in device to device latency with this caching in place.

Pushy’s scale and system design considerations make the work technically interesting, but we also deliberately focus on non-technical aspects that have helped to drive Pushy’s growth. We focus on iterative development that solves the hardest problem first, with projects frequently starting as quick hacks or prototypes to prove out a feature. As we build that initial version, we do our best to keep an eye towards the future, allowing us to move quickly from supporting a single, focused use case to a broad, generalized solution. For example, for our cross-device messaging, we were able to solve hard problems in the early work for Triviaverse that we later leveraged for the generic device to device solution.

As one can quickly see in the system diagrams above, Pushy doesn’t exist in a vacuum, with projects frequently involving at least half a dozen teams. Trust, experience, communication, and strong relationships all enable this to work. Our team wouldn’t exist without our platform users, and we certainly wouldn’t be here writing this post without all of the work our product and client teams do. This has also emphasized the importance of building and sharing; if we’re able to get a prototype together with a device team, we’re able to then show it off to seed ideas from other teams. It’s one thing to say that you can send these messages, but it’s another to show off the TV responding to the first click of the phone controller button!

If there’s anything certain in this world, it’s that Pushy will continue to grow and evolve. We have many new features in the works, like WebSocket message proxying, WebSocket message tracing, a global broadcast mechanism, and subscription functionality in support of Games and Live. With all of this investment, Pushy is a stable, reinforced foundation, ready for this next generation of features.

We’ll be writing about these new features as well; stay tuned for future posts.

Special thanks to our stunning colleagues Jeremy Kelly and Justin Guerra, who have both been invaluable to Pushy’s growth and the WebSocket ecosystem at large. We’d also like to thank our larger teams and our numerous partners for their great work; it truly takes a village!


