Maximize the Value of Data-in-Motion
With Big Data from the Internet of Things


Ready to get started in the cloud?

Download HDF
Hortonworks DataFlow (HDF™)

Hortonworks DataFlow (HDF) provides the only end-to-end platform that collects, curates, analyzes, and acts on data in real time, on-premises or in the cloud, through a drag-and-drop visual interface. HDF is an integrated solution built on Apache NiFi/MiNiFi, Apache Kafka, Apache Storm, and Druid.

The HDF streaming data analytics platform includes Data Flow Management, Stream Processing, and Enterprise Services.

HDF Data-In-Motion Platform

Three Major Components of Hortonworks DataFlow


Easy, Secure, and Reliable Way to Manage Data Flow 

Collect and manipulate data flows securely and efficiently while maintaining real-time operational visibility, control, and management.


Immediate and Continuous Insights  

Build streaming analytics applications in minutes to capture perishable insights in real-time without writing a single line of code.

Learn More

Corporate Governance, Security and Operations 

Manage the HDF and HDP ecosystem with a comprehensive management console for provisioning, monitoring, and governance.

Learn More

A Unified, Data-Source-Agnostic Collection Platform

HDF provides full-featured data collection capabilities that are streaming-data agnostic, with over 220 integrated processors. Data can be collected from dynamic, distributed sources of differing formats, schemas, protocols, speeds, and sizes, including machines, geolocation devices, clickstreams, files, social feeds, log files, and video.
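The idea behind source-agnostic collection is that records arriving in different formats are normalized into one common envelope before flowing downstream. As a rough, purely illustrative sketch of that idea (HDF/NiFi does this with FlowFiles and per-format processors, not with a function like this; the `to_envelope` name and envelope fields are invented here):

```python
import csv
import io
import json
from datetime import datetime, timezone

def to_envelope(source, fmt, payload):
    """Wrap raw records in a common envelope, regardless of input format.

    Toy illustration only: real collection platforms handle this with
    dedicated per-format processors, schemas, and provenance metadata.
    """
    if fmt == "json":
        records = [json.loads(payload)]
    elif fmt == "csv":
        records = list(csv.DictReader(io.StringIO(payload)))
    else:
        raise ValueError(f"unsupported format: {fmt}")
    ts = datetime.now(timezone.utc).isoformat()
    return [{"source": source, "ingested_at": ts, "data": r} for r in records]

# Two very different sources become one normalized stream:
log_events = to_envelope("syslog", "json", '{"level": "WARN", "msg": "disk 90% full"}')
sensor_rows = to_envelope("plant-42", "csv", "sensor,temp\nA1,71.5\nA2,68.9")
```

Downstream consumers then only need to understand the envelope, not every source format.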

Learn more:

  • How real-time, data-source-agnostic dataflow management makes data movement easy
    Watch Video
    Learn More
  • Learn what HDF can do to optimize log analytics from the edge
    Read More
Powerful Data Collection


With HDF, data collection is no longer a tedious process. You can manage data in full flight with a visual control panel to adjust sources, join and split streams, and prioritize data flows. HDF can also add contextual data to your streams for more complete analysis and insight. Always-on data provenance and audit trails provide security and governance compliance, along with real-time troubleshooting when necessary. Integrated with Apache NiFi, MiNiFi, Kafka, and Storm, HDF is ready for high-volume event processing for immediate analysis and action. Kafka accommodates differing rates of data creation and delivery, while Storm provides real-time streaming analytics and immediate insights at massive scale.
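"Prioritize data flow" means urgent records can jump ahead of routine ones in a queue. A minimal stand-alone sketch of that behavior (NiFi implements this with configurable prioritizers on each connection; this toy class and its names are invented for illustration):

```python
import heapq
import itertools

class PrioritizedFlow:
    """Toy priority queue for flow items: urgent data jumps the line."""

    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # tie-breaker keeps FIFO order within a priority

    def put(self, item, priority=10):
        # Lower number = higher priority, mirroring "process alerts first".
        heapq.heappush(self._heap, (priority, next(self._seq), item))

    def get(self):
        return heapq.heappop(self._heap)[2]

flow = PrioritizedFlow()
flow.put("routine-metric")
flow.put("security-alert", priority=1)
flow.put("routine-metric-2")
first = flow.get()  # the alert is delivered ahead of routine data
```

Equal-priority items still leave the queue in arrival order, which is why the sequence counter is part of the heap key.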

Learn more:

  • Improve operational efficiency with streaming data managed through Apache NiFi's real-time visual user interface.
    Watch Video
Real-Time Data Flow Management


HDF secures end-to-end data flow and routing from source to destination with discrete user authorization and a detailed, real-time visual chain of custody. Use HDF's visual user interface to encrypt streaming data, route it to Kafka, configure buffers, and manage congestion so that data can be dynamically prioritized and securely delivered. HDF enables role-based data access, allowing enterprises to dynamically and securely share select pieces of pertinent data. HDF can deploy flow management and streaming applications in a Kerberized environment with minimal operational overhead.
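Role-based data sharing boils down to showing each role only the fields it is authorized to see. A toy sketch of that attribute-level filtering (in HDF this is enforced by platform security policies, not application code; the roles, fields, and `redact` helper here are hypothetical):

```python
# Hypothetical policy: which record fields each role may see.
ROLE_FIELDS = {
    "analyst": {"region", "amount"},
    "auditor": {"region", "amount", "account_id"},
}

def redact(record, role):
    """Return only the fields the given role is authorized to see.

    Illustrative only: real deployments enforce this centrally via
    security policies, not per-application filtering like this.
    """
    allowed = ROLE_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

payment = {"account_id": "AC-881", "region": "EMEA", "amount": 1250}
analyst_view = redact(payment, "analyst")  # account_id stripped out
auditor_view = redact(payment, "auditor")  # full record visible
```

An unknown role gets an empty set of allowed fields, so the default is to share nothing.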

Learn more:

  • See how granular data access improves on role-based access
    Watch Video
Enterprise-Grade Security


HDF includes a complete streaming analytics module, Streaming Analytics Manager (SAM), for building streaming analytics applications that perform event correlation, context enrichment, complex pattern matching, and analytical aggregations, and that create alerts and notifications when insights are discovered. SAM lets application developers, DevOps teams, and business analysts collaboratively build, analyze, deploy, and manage applications in minutes without writing a single line of code. Analysts use pre-built charts to quickly build analyses and create dashboards, while DevOps teams can manage and monitor application performance right out of the box.
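To make "windowed aggregation plus a rule that fires an alert" concrete, here is a tiny in-process stand-in for the kind of logic SAM expresses visually. Everything here (device names, statuses, window size, threshold) is invented for illustration; SAM users would wire this up with drag-and-drop components rather than code:

```python
from collections import defaultdict, deque

def windowed_alerts(events, window=3, threshold=2):
    """Alert when a device reports >= `threshold` errors within its
    last `window` events. Toy sketch of windowed aggregation + rule.
    """
    recent = defaultdict(lambda: deque(maxlen=window))
    alerts = []
    for device, status in events:
        recent[device].append(status)
        if list(recent[device]).count("ERROR") >= threshold:
            alerts.append(device)
            recent[device].clear()  # avoid re-alerting on the same burst
    return alerts

stream = [("d1", "OK"), ("d1", "ERROR"), ("d2", "ERROR"),
          ("d1", "ERROR"), ("d2", "OK"), ("d2", "ERROR"), ("d2", "ERROR")]
fired = windowed_alerts(stream)
```

Each device keeps its own sliding window, so a burst of errors on one device never masks or triggers alerts for another.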

HDF includes Schema Registry, a central schema repository that allows analytics applications to interact with each other flexibly. Users can save, edit, and retrieve schemas for the data they need, and schemas can be attached to data without incurring additional per-record overhead, for greater operational efficiency. With schema version management, data consumers and data producers can evolve at different rates, and schema validation greatly improves data quality. A central schema registry also provides greater governance over how data is used. Schema Registry is integrated with Apache NiFi and the HDF Streaming Analytics Manager.
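The core idea is that producers and consumers reference a schema by name and version instead of shipping a schema with every record, and that version management keeps changes compatible. A minimal in-memory sketch of that idea (purely illustrative: the real Schema Registry is a REST service with configurable compatibility policies; this class, its naive compatibility rule, and the `truck_event` schema are invented here):

```python
class ToySchemaRegistry:
    """Minimal in-memory sketch of a schema registry with versioning."""

    def __init__(self):
        self._schemas = {}  # name -> list of schema versions (field dicts)

    def register(self, name, fields):
        versions = self._schemas.setdefault(name, [])
        if versions:
            # Naive backward-compatibility rule: a new version may add
            # fields but must keep every field the last version had.
            missing = set(versions[-1]) - set(fields)
            if missing:
                raise ValueError(f"incompatible change, drops: {missing}")
        versions.append(dict(fields))
        return len(versions)  # 1-based version number

    def get(self, name, version=None):
        versions = self._schemas[name]
        return versions[-1] if version is None else versions[version - 1]

reg = ToySchemaRegistry()
v1 = reg.register("truck_event", {"driver_id": "int", "speed": "float"})
v2 = reg.register("truck_event", {"driver_id": "int", "speed": "float", "lat": "float"})
```

A consumer pinned to version 1 keeps working after version 2 is registered, which is how producers and consumers evolve at different rates.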

Build Analytics Faster with Streaming Analytics Manager


Build analytics applications easily with a drag-and-drop visual paradigm and drop-down analytics functions


Analyze quickly with rich visual dashboards and an analytics engine powered by Druid


Operate efficiently with prebuilt monitoring dashboards of system metrics

Manage Data Flows More Easily with Schema Registry


Eliminate the need to code and attach a schema to every piece of data, reducing operational overhead


Allow data consumers and producers to evolve at different rates with schema version management


Store schemas for any type of entity or data store, not just Kafka

HDF User Guides

Get HDF release notes, user guides, developer guides, and getting-started guides.


We provide the industry's best support for Apache NiFi, Kafka, and Storm in the enterprise. Connect with our team of experts to get help on your journey.


Get hands-on training from Big Data experts, available in person or on demand whenever you need it.

Ready to get started?